Dec 03 23:41:03 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 23:41:03 crc restorecon[4758]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 23:41:03 crc restorecon[4758]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 
crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 23:41:03 crc restorecon[4758]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 
crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc 
restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:03 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 23:41:04 crc restorecon[4758]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 23:41:04 crc kubenswrapper[4764]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 23:41:04 crc kubenswrapper[4764]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 23:41:04 crc kubenswrapper[4764]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 23:41:04 crc kubenswrapper[4764]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 23:41:04 crc kubenswrapper[4764]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 23:41:04 crc kubenswrapper[4764]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.365833 4764 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369169 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369187 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369192 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369196 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369200 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369204 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369209 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369212 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369218 4764 feature_gate.go:330] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369222 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369227 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369234 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369240 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369245 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369249 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369254 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369258 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369262 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369265 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369269 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369273 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369276 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369280 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369284 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369287 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369291 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369295 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369299 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369302 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369306 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369310 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369313 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369317 4764 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369321 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369325 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369329 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369332 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369336 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369341 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369345 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369348 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369352 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369356 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369360 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369363 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369368 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369373 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369377 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369387 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369391 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369394 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369397 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369401 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369405 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369408 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369411 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369417 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369422 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369426 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369435 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369439 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369443 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369447 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369450 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369454 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369458 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369461 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369465 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369468 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369472 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.369475 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369548 4764 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369556 4764 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369563 4764 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369569 4764 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369574 4764 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369578 4764 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369584 4764 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369589 4764 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369594 4764 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369598 4764 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369603 4764 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369607 4764 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369611 4764 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369616 4764 flags.go:64] FLAG: --cgroup-root=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369621 4764 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369625 4764 flags.go:64] FLAG: --client-ca-file=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369629 4764 flags.go:64] FLAG: --cloud-config=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369633 4764 flags.go:64] FLAG: --cloud-provider=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369637 4764 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369642 4764 flags.go:64] FLAG: --cluster-domain=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369646 4764 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369650 4764 flags.go:64] FLAG: --config-dir=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369654 4764 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369658 4764 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369664 4764 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369668 4764 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369673 4764 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369677 4764 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369681 4764 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369685 4764 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369689 4764 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369693 4764 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369697 4764 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369703 4764 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369707 4764 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369729 4764 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369734 4764 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369738 4764 flags.go:64] FLAG: --enable-server="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369745 4764 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369752 4764 flags.go:64] FLAG: --event-burst="100"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369756 4764 flags.go:64] FLAG: --event-qps="50"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369760 4764 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369764 4764 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369768 4764 flags.go:64] FLAG: --eviction-hard=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369773 4764 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369777 4764 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369781 4764 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369786 4764 flags.go:64] FLAG: --eviction-soft=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369790 4764 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369795 4764 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369800 4764 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369804 4764 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369808 4764 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369812 4764 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369816 4764 flags.go:64] FLAG: --feature-gates=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369826 4764 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369830 4764 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369835 4764 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369840 4764 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369845 4764 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369851 4764 flags.go:64] FLAG: --help="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369855 4764 flags.go:64] FLAG: --hostname-override=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369859 4764 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369864 4764 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369868 4764 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369873 4764 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369877 4764 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369882 4764 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369886 4764 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369890 4764 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369895 4764 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369900 4764 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369905 4764 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369909 4764 flags.go:64] FLAG: --kube-reserved=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369913 4764 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369917 4764 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369921 4764 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369925 4764 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369930 4764 flags.go:64] FLAG: --lock-file=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369934 4764 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369938 4764 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369942 4764 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369948 4764 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369952 4764 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369956 4764 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369961 4764 flags.go:64] FLAG: --logging-format="text"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369965 4764 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369969 4764 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369973 4764 flags.go:64] FLAG: --manifest-url=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369977 4764 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369983 4764 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369987 4764 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369992 4764 flags.go:64] FLAG: --max-pods="110"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.369996 4764 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370000 4764 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370004 4764 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370008 4764 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370012 4764 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370016 4764 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370020 4764 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370029 4764 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370033 4764 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370038 4764 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370042 4764 flags.go:64] FLAG: --pod-cidr=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370046 4764 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370052 4764 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370056 4764 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370060 4764 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370064 4764 flags.go:64] FLAG: --port="10250"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370068 4764 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370073 4764 flags.go:64] FLAG: --provider-id=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370077 4764 flags.go:64] FLAG: --qos-reserved=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370081 4764 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370085 4764 flags.go:64] FLAG: --register-node="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370089 4764 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370093 4764 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370100 4764 flags.go:64] FLAG: --registry-burst="10"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370105 4764 flags.go:64] FLAG: --registry-qps="5"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370109 4764 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370113 4764 flags.go:64] FLAG: --reserved-memory=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370119 4764 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370123 4764 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370128 4764 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370132 4764 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370136 4764 flags.go:64] FLAG: --runonce="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370140 4764 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370145 4764 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370149 4764 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370153 4764 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370158 4764 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370162 4764 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370166 4764 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370171 4764 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370175 4764 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370180 4764 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370184 4764 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370188 4764 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370192 4764 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370197 4764 flags.go:64] FLAG: --system-cgroups=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370201 4764 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370207 4764 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370211 4764 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370216 4764 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370221 4764 flags.go:64] FLAG: --tls-min-version=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370225 4764 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370229 4764 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370233 4764 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370237 4764 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370242 4764 flags.go:64] FLAG: --v="2"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370247 4764 flags.go:64] FLAG: --version="false"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370253 4764 flags.go:64] FLAG: --vmodule=""
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370258 4764 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370262 4764 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370377 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370382 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370386 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370390 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370394 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370398 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370401 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370405 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370408 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370412 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370415 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370419 4764 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370422 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370426 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370430 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370433 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370437 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370441 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370444 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370448 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370451 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370455 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370459 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370462 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370466 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370471 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370476 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370480 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370484 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370488 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370495 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370499 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370503 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370507 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370511 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370514 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370518 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370522 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370526 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370529 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370533 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370537 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370540 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370544 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370547 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370552 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370556 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370560 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370563 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370567 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370570 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370573 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370578 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370583 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370586 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370590 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370594 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370597 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370601 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370605 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370608 4764 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370612 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370617 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370620 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370624 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370628 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370633 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370637 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370641 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370645 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.370648 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.370660 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.384270 4764 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.384995 4764 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385584 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385610 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385622 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385631 4764 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385642 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385651 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385659 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385667 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385681 4764 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385695 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385706 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385743 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385755 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385767 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385778 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385789 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385798 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385807 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385816 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385824 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385832 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385840 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385848 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385856 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385864 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 
23:41:04.385874 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385883 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385892 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385900 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385909 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385918 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385925 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385934 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385952 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385961 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385969 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385977 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385984 4764 feature_gate.go:330] unrecognized feature gate: Example Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.385992 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386001 4764 feature_gate.go:330] unrecognized feature gate: 
VolumeGroupSnapshot Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386009 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386017 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386025 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386032 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386041 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386049 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386057 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386066 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386073 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386082 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386089 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386097 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386105 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386112 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386120 4764 
feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386129 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386137 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386148 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386158 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386168 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386180 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386189 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386197 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386205 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386212 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386221 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386228 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386236 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386244 
4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386251 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386259 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.386273 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386529 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386544 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386552 4764 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386563 4764 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386574 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386584 4764 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386593 4764 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386601 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386611 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386621 4764 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386631 4764 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386638 4764 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386647 4764 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386655 4764 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386663 4764 feature_gate.go:330] unrecognized feature gate: Example Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386671 4764 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386679 4764 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386688 4764 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386699 4764 
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386709 4764 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386746 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386758 4764 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386769 4764 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386780 4764 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386793 4764 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386803 4764 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386815 4764 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386828 4764 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386839 4764 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386851 4764 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386862 4764 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386871 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 
23:41:04.386879 4764 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386887 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386895 4764 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386903 4764 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386910 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386933 4764 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386941 4764 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386949 4764 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386957 4764 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386964 4764 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386972 4764 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386979 4764 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386987 4764 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.386995 4764 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387003 4764 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 
23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387011 4764 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387020 4764 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387028 4764 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387036 4764 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387044 4764 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387052 4764 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387059 4764 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387067 4764 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387075 4764 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387083 4764 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387094 4764 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387104 4764 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387112 4764 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387122 4764 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387129 4764 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387138 4764 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387145 4764 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387153 4764 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387161 4764 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387169 4764 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387177 4764 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387184 4764 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387192 4764 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.387200 4764 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.387212 4764 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.387876 4764 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.392299 4764 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.392422 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.393499 4764 server.go:997] "Starting client certificate rotation" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.393546 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.393787 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-25 12:38:52.526843798 +0000 UTC Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.393922 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.402918 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.404891 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.406167 4764 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.417503 4764 log.go:25] "Validated CRI v1 runtime API" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.440792 4764 log.go:25] "Validated CRI v1 image API" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.442427 4764 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.445601 4764 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-23-36-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.445666 4764 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.469547 4764 manager.go:217] Machine: {Timestamp:2025-12-03 23:41:04.46721665 +0000 UTC m=+0.228541211 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1697e568-5f3e-4ea7-a9c8-fd5696181e3f BootID:7b7d1078-78f6-4cc3-a0d3-6cc465c742cf Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ec:2d:00 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ec:2d:00 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c6:bc:3c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:af:e1:4d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:81:23:e4 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:cf:92:4e Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:c1:d8:7e Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1a:c0:94:fc:1d:af Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:5a:44:2e:af:cb:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 
Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.470219 4764 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.470531 4764 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.471585 4764 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.472110 4764 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.472224 4764 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.472632 4764 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.472691 4764 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.473100 4764 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.473165 4764 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.473523 4764 state_mem.go:36] "Initialized new in-memory state store" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.473671 4764 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.474894 4764 kubelet.go:418] "Attempting to sync node with API server" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.474937 4764 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.474978 4764 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.474999 4764 kubelet.go:324] "Adding apiserver pod source" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.475016 4764 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.477594 4764 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.477841 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.478046 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.478066 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.478250 4764 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.478246 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.479683 4764 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480700 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480768 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480784 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480799 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480821 4764 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480836 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480848 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480872 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480887 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480900 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480938 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.480955 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.481306 4764 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.482071 4764 server.go:1280] "Started kubelet" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.482617 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.483012 4764 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.482440 4764 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 23:41:04 crc systemd[1]: Started Kubernetes Kubelet. 
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.484104 4764 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.485875 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.485924 4764 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.486159 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 06:09:57.152906915 +0000 UTC Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.486218 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.486333 4764 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.486347 4764 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.486445 4764 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.486977 4764 server.go:460] "Adding debug handlers to kubelet server" Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.487241 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.487337 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.487371 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="200ms" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.487034 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dd909a7303575 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 23:41:04.481990005 +0000 UTC m=+0.243314446,LastTimestamp:2025-12-03 23:41:04.481990005 +0000 UTC m=+0.243314446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.489410 4764 factory.go:55] Registering systemd factory Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.489444 4764 factory.go:221] Registration of the systemd container factory successfully Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.490024 4764 factory.go:153] Registering CRI-O factory Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.490065 4764 factory.go:221] Registration of the crio container factory successfully Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.490168 4764 factory.go:219] Registration of the containerd 
container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.490213 4764 factory.go:103] Registering Raw factory Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.490240 4764 manager.go:1196] Started watching for new ooms in manager Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.491326 4764 manager.go:319] Starting recovery of all containers Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.505838 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.505902 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.505916 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.505930 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.505945 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.505958 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506004 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506016 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506030 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506041 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506051 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506062 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506074 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506090 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506102 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506113 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506148 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506160 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506173 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506187 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506198 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506211 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506221 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506232 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506244 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506258 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506273 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506286 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506297 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506308 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506319 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506331 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506354 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506365 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506377 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" 
seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506388 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506400 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506411 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506422 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506435 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506447 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 
23:41:04.506461 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506474 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506487 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506502 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506536 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506550 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506564 4764 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506579 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506591 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506604 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506616 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506709 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506766 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506781 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506796 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506808 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506820 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506833 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506844 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506857 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506869 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506882 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506896 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506910 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506922 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506935 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506947 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506958 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506971 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506983 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.506995 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507009 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507021 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507034 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507047 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507060 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507071 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 
23:41:04.507083 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507095 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507108 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507120 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507136 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507147 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507162 4764 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507174 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507184 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507195 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507207 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507218 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507229 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507241 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507252 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507264 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507276 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507287 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507301 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507311 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507322 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507336 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507349 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507360 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507373 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" 
seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507385 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507402 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507416 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507428 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507441 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507453 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507467 4764 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507478 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507490 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507504 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507516 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.507527 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510015 4764 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510090 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510120 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510143 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510163 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510186 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510207 4764 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510227 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510247 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510268 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510287 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510308 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510325 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510346 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510365 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510387 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510408 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510430 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510452 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" 
seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510471 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510493 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510512 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510530 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510550 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510569 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 
23:41:04.510587 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510608 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510629 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510651 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510672 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510692 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510782 4764 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510803 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510826 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510849 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510867 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510888 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510908 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510927 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510947 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510969 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.510990 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511012 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511035 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511055 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511074 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511095 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511116 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511137 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511157 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" 
seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511177 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511197 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511217 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511239 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511258 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511277 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 
23:41:04.511298 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511318 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511338 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511359 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511380 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511403 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511425 4764 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511444 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511463 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511483 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511502 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511520 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511540 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511562 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511583 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511602 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511622 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511642 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511660 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511679 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511697 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511756 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511776 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511797 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511832 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" 
seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511852 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511872 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511890 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511909 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511929 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511950 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 
23:41:04.511968 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.511987 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512006 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512027 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512046 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512064 4764 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512084 4764 reconstruct.go:97] "Volume reconstruction 
finished" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512098 4764 reconciler.go:26] "Reconciler: start to sync state" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.512949 4764 manager.go:324] Recovery completed Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.521262 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.523698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.523742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.523755 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.526265 4764 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.526299 4764 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.526333 4764 state_mem.go:36] "Initialized new in-memory state store" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.535997 4764 policy_none.go:49] "None policy: Start" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.536999 4764 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.537019 4764 state_mem.go:35] "Initializing new in-memory state store" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.542647 4764 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.544392 4764 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.544441 4764 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.544469 4764 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.544617 4764 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 23:41:04 crc kubenswrapper[4764]: W1203 23:41:04.546114 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.546191 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.587208 4764 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.587817 4764 manager.go:334] "Starting Device Plugin manager" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.587900 4764 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.587917 4764 server.go:79] "Starting device plugin registration server" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.589341 4764 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 23:41:04 crc 
kubenswrapper[4764]: I1203 23:41:04.589369 4764 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.589846 4764 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.590048 4764 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.590069 4764 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.599782 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.644933 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.645274 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.646750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.646810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.646828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.647049 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.647339 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.647401 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.648497 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.648541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.648556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649079 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649511 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649944 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.649992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.650601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.650640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.650656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.650835 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.651029 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.651106 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652625 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652753 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.652804 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654705 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.654779 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.655828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.655892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.655906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.688194 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="400ms" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.690290 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.692453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.692490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.692521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.692554 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.692976 4764 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714395 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714592 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.714978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.715004 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.715048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816535 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816640 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816702 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816775 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.816973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.817541 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc 
kubenswrapper[4764]: I1203 23:41:04.820888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821433 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821511 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821588 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821625 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821646 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.821777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.893758 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.895602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.895666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.895687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:04 crc kubenswrapper[4764]: I1203 23:41:04.895780 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 23:41:04 crc kubenswrapper[4764]: E1203 23:41:04.896435 4764 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.000494 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.005696 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.022177 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.037678 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-79307828b977638f4d45373b848f01232b2d7b64eaeda8125024a79e30282557 WatchSource:0}: Error finding container 79307828b977638f4d45373b848f01232b2d7b64eaeda8125024a79e30282557: Status 404 returned error can't find the container with id 79307828b977638f4d45373b848f01232b2d7b64eaeda8125024a79e30282557 Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.038481 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-27975c5e7d2e37ceef6fceb13d7656fb47a1c28c72c643197148f62d451d99c4 WatchSource:0}: Error finding container 27975c5e7d2e37ceef6fceb13d7656fb47a1c28c72c643197148f62d451d99c4: Status 404 returned error can't find the container with id 27975c5e7d2e37ceef6fceb13d7656fb47a1c28c72c643197148f62d451d99c4 Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.040092 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.044565 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-2e14baba46274e8435a82d41184b1bd224f4584eb4b7a636da2647f26105637c WatchSource:0}: Error finding container 2e14baba46274e8435a82d41184b1bd224f4584eb4b7a636da2647f26105637c: Status 404 returned error can't find the container with id 2e14baba46274e8435a82d41184b1bd224f4584eb4b7a636da2647f26105637c Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.050078 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.063596 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-45cbe463f6af07cdfea0765e09c95a9253b39b4aa5c1cc6136d5b6417282bd59 WatchSource:0}: Error finding container 45cbe463f6af07cdfea0765e09c95a9253b39b4aa5c1cc6136d5b6417282bd59: Status 404 returned error can't find the container with id 45cbe463f6af07cdfea0765e09c95a9253b39b4aa5c1cc6136d5b6417282bd59 Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.086607 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-dd6e92dde7dc5e2421db57d4a088a4a79079cdd96933c79cd880a510868c740a WatchSource:0}: Error finding container dd6e92dde7dc5e2421db57d4a088a4a79079cdd96933c79cd880a510868c740a: Status 404 returned error can't find the container with id dd6e92dde7dc5e2421db57d4a088a4a79079cdd96933c79cd880a510868c740a Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.088953 4764 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="800ms" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.297450 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.300349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.300392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.300403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.300427 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.300953 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.311358 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.311444 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.331883 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.331979 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.483337 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.486945 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:27:43.126786987 +0000 UTC Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.487022 4764 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 409h46m37.639766874s for next certificate rotation Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.536195 4764 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.536410 4764 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.550249 4764 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27" exitCode=0 Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.550331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.550427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"dd6e92dde7dc5e2421db57d4a088a4a79079cdd96933c79cd880a510868c740a"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.550506 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.551562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.551585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.551602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.551987 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.552039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"45cbe463f6af07cdfea0765e09c95a9253b39b4aa5c1cc6136d5b6417282bd59"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.554602 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d" exitCode=0 Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.554690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.554732 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2e14baba46274e8435a82d41184b1bd224f4584eb4b7a636da2647f26105637c"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.555019 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.558377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.558404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 
23:41:05.558413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.560047 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.560523 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="74da77d09f44056600e3a1577d4df7874c2bad6840053e51bc756ba2afbc964f" exitCode=0 Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.560587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"74da77d09f44056600e3a1577d4df7874c2bad6840053e51bc756ba2afbc964f"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.560647 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"27975c5e7d2e37ceef6fceb13d7656fb47a1c28c72c643197148f62d451d99c4"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.560796 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.561597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.561666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.561682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.562536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:05 
crc kubenswrapper[4764]: I1203 23:41:05.562556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.562566 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.563234 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a02d507c3e052b0f28591fb8ff526916ff9edb288dcf8102fd230759f569a56c" exitCode=0 Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.563263 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a02d507c3e052b0f28591fb8ff526916ff9edb288dcf8102fd230759f569a56c"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.563286 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"79307828b977638f4d45373b848f01232b2d7b64eaeda8125024a79e30282557"} Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.563344 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.564064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.564104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:05 crc kubenswrapper[4764]: I1203 23:41:05.564121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:05 crc kubenswrapper[4764]: W1203 23:41:05.669843 4764 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.670269 4764 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Dec 03 23:41:05 crc kubenswrapper[4764]: E1203 23:41:05.892935 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="1.6s" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.102076 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.103286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.103356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.103374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.103407 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 23:41:06 crc kubenswrapper[4764]: E1203 23:41:06.103968 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.569300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.569363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.569392 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.569525 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.570928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.570973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.570992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.574782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.574826 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.574825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.574969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.582742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.582776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.582785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.585811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.585838 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.585848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.585857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.587278 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b420723787bee5ebd89e193889c9cf6d18b00596e4267e3dddf294840b448eb8" exitCode=0 Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.587316 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b420723787bee5ebd89e193889c9cf6d18b00596e4267e3dddf294840b448eb8"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.587397 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.588040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.588059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.588066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.590086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"dae4bdca42055f4db895a31b2f091f4daa189af3019cc5dc0065244f64220792"} Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.590144 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.590741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.590761 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.590769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:06 crc kubenswrapper[4764]: I1203 23:41:06.593172 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.596337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74"} Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.596415 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.597535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.597582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.597596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.599005 4764 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7b8ea24d297942b7727803442c9e46e572aa631249160b6f4bc691e7e9fcf44d" exitCode=0 Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.599144 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7b8ea24d297942b7727803442c9e46e572aa631249160b6f4bc691e7e9fcf44d"} Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.599172 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.599383 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.600228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.600260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.600272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.600655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.601064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.601090 
4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.704767 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.706151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.706209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.706227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:07 crc kubenswrapper[4764]: I1203 23:41:07.706268 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.606402 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f64ea52f1c788768f5642c3a3e8a9dc1982f28b808e2f01b64623f0ab9930d42"} Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.606457 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.606516 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.606466 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c1b58481c4ab461b489b16f6d59f6387aa77ef8e970cd733acff0dd7f6e485f9"} Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.606588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"041271798de9eb2f2ab9014d24b9fea9364f5347713daafb99ac2163438beeff"} Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.607571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.607604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:08 crc kubenswrapper[4764]: I1203 23:41:08.607622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.525891 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.616994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be35b25e472773c9ed1239d4805d0a6dd46a7681a3b0ab830a052bcfa147b21c"} Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.617042 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.617076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f07b3af8030b087d047498877c1f4b294fd0eeaac9ee25a3e6153a0d565d934a"} Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.617125 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.617146 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.619308 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.619355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.619366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.619390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.619410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:09 crc kubenswrapper[4764]: I1203 23:41:09.619392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.334044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.334299 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.335938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.335989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.336007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.619507 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.621389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.621463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:10 crc kubenswrapper[4764]: I1203 23:41:10.621552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:11 crc kubenswrapper[4764]: I1203 23:41:11.788998 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:11 crc kubenswrapper[4764]: I1203 23:41:11.789237 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 23:41:11 crc kubenswrapper[4764]: I1203 23:41:11.789297 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:11 crc kubenswrapper[4764]: I1203 23:41:11.791328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:11 crc kubenswrapper[4764]: I1203 23:41:11.791394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:11 crc kubenswrapper[4764]: I1203 23:41:11.791413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:12 crc kubenswrapper[4764]: I1203 23:41:12.227656 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 23:41:12 crc kubenswrapper[4764]: I1203 23:41:12.227931 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:12 crc kubenswrapper[4764]: I1203 23:41:12.229792 4764 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:12 crc kubenswrapper[4764]: I1203 23:41:12.229865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:12 crc kubenswrapper[4764]: I1203 23:41:12.229889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:13 crc kubenswrapper[4764]: I1203 23:41:13.953679 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:13 crc kubenswrapper[4764]: I1203 23:41:13.953959 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:13 crc kubenswrapper[4764]: I1203 23:41:13.955529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:13 crc kubenswrapper[4764]: I1203 23:41:13.955590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:13 crc kubenswrapper[4764]: I1203 23:41:13.955607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:13 crc kubenswrapper[4764]: I1203 23:41:13.977559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.109862 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.110142 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.112083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.112150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.112175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.345544 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.393484 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.402383 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:14 crc kubenswrapper[4764]: E1203 23:41:14.599903 4764 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.631768 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.633180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.633245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:14 crc kubenswrapper[4764]: I1203 23:41:14.633264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:15 crc kubenswrapper[4764]: I1203 23:41:15.635071 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 
03 23:41:15 crc kubenswrapper[4764]: I1203 23:41:15.636876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:15 crc kubenswrapper[4764]: I1203 23:41:15.636931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:15 crc kubenswrapper[4764]: I1203 23:41:15.636948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:15 crc kubenswrapper[4764]: I1203 23:41:15.642917 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 23:41:16.484602 4764 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 23:41:16 crc kubenswrapper[4764]: E1203 23:41:16.597073 4764 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 23:41:16.637288 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 23:41:16.638108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 23:41:16.638149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 
23:41:16.638165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 23:41:16.978494 4764 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 23:41:16 crc kubenswrapper[4764]: I1203 23:41:16.978567 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 23:41:17 crc kubenswrapper[4764]: E1203 23:41:17.494054 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 23:41:17 crc kubenswrapper[4764]: I1203 23:41:17.514140 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 23:41:17 crc kubenswrapper[4764]: I1203 23:41:17.514202 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 23:41:17 crc kubenswrapper[4764]: I1203 23:41:17.517930 4764 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 23:41:17 crc kubenswrapper[4764]: I1203 23:41:17.517998 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.564468 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.564798 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.566445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.566554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.566582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.593837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.645047 
4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.647066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.647317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.647503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:19 crc kubenswrapper[4764]: I1203 23:41:19.662473 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 23:41:20 crc kubenswrapper[4764]: I1203 23:41:20.648233 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:20 crc kubenswrapper[4764]: I1203 23:41:20.649683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:20 crc kubenswrapper[4764]: I1203 23:41:20.649775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:20 crc kubenswrapper[4764]: I1203 23:41:20.649794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:20 crc kubenswrapper[4764]: I1203 23:41:20.938828 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 23:41:20 crc kubenswrapper[4764]: I1203 23:41:20.955571 4764 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Dec 03 23:41:21 crc kubenswrapper[4764]: I1203 23:41:21.797078 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:21 crc kubenswrapper[4764]: I1203 23:41:21.797387 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:21 crc kubenswrapper[4764]: I1203 23:41:21.799012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:21 crc kubenswrapper[4764]: I1203 23:41:21.799071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:21 crc kubenswrapper[4764]: I1203 23:41:21.799099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:21 crc kubenswrapper[4764]: I1203 23:41:21.805224 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.503296 4764 trace.go:236] Trace[13306372]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 23:41:07.715) (total time: 14787ms): Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[13306372]: ---"Objects listed" error: 14787ms (23:41:22.503) Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[13306372]: [14.787838806s] [14.787838806s] END Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.503338 4764 trace.go:236] Trace[1113706623]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 23:41:07.923) (total time: 14580ms): Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[1113706623]: ---"Objects listed" error: 14580ms (23:41:22.503) Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[1113706623]: [14.580069857s] [14.580069857s] END Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.503370 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.503346 
4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.503294 4764 trace.go:236] Trace[949516035]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 23:41:07.673) (total time: 14829ms): Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[949516035]: ---"Objects listed" error: 14829ms (23:41:22.503) Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[949516035]: [14.829817625s] [14.829817625s] END Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.504683 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 23:41:22 crc kubenswrapper[4764]: E1203 23:41:22.511025 4764 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.511900 4764 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.512144 4764 trace.go:236] Trace[1041531263]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 23:41:07.851) (total time: 14660ms): Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[1041531263]: ---"Objects listed" error: 14660ms (23:41:22.511) Dec 03 23:41:22 crc kubenswrapper[4764]: Trace[1041531263]: [14.660309169s] [14.660309169s] END Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.512179 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 23:41:22 crc kubenswrapper[4764]: I1203 23:41:22.951897 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.377218 4764 csr.go:261] certificate signing request csr-rxlpd is 
approved, waiting to be issued Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.383627 4764 csr.go:257] certificate signing request csr-rxlpd is issued Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.485956 4764 apiserver.go:52] "Watching apiserver" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.489225 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.489531 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.490010 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.490064 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.490231 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.490480 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.490507 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.490543 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.490562 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.490597 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.490753 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.494799 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.495005 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.495209 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.495314 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.495436 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.495579 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.503653 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.504161 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.504406 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.526475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.538782 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.551364 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.562870 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.573249 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.582417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.587536 4764 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.594607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.608989 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.617952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618031 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618100 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618164 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618184 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.618229 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618262 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618310 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618399 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618418 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618439 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618455 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc 
kubenswrapper[4764]: I1203 23:41:23.618485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618502 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618503 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618810 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618897 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618932 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618967 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.618997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619108 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619157 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " 
Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619197 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619226 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619230 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619250 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619279 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619320 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619304 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619474 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619566 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619698 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619777 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619835 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619898 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619931 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619959 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620010 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.620062 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620096 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620113 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620132 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620209 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620247 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620321 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620339 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620356 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620377 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620394 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620413 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620432 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620452 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620542 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620659 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620794 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620813 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620841 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 23:41:23 crc 
kubenswrapper[4764]: I1203 23:41:23.620875 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620902 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620934 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620965 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621045 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621071 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621096 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621155 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") 
" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621253 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621281 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621310 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621437 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 23:41:23 crc 
kubenswrapper[4764]: I1203 23:41:23.621547 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621690 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621736 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621765 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621790 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: 
\"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621963 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622019 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622046 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622129 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622194 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 
23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622251 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622282 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622361 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622384 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622416 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622569 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622590 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622679 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622699 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622738 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622756 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622780 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622853 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622885 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622945 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623010 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623035 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623060 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623083 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623127 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623179 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623208 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623256 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623283 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" 
(UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623338 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623366 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623413 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623518 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.623540 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623559 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623576 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623616 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624229 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624257 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624310 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624341 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624365 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" 
(UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624452 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624477 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624573 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624763 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624787 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624817 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624948 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625006 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625074 4764 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625096 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625112 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625128 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625143 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625159 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625175 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625192 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625207 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625222 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.626807 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.619724 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620010 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620272 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620363 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620669 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620823 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.620971 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621081 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621132 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621227 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621392 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621644 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621692 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621710 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.634208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.634232 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621810 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.621944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622161 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622256 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622493 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622560 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622605 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622431 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.622852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623155 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623454 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.623647 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.624777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625050 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625546 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625869 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.625986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.626095 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.626965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.626994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627169 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627606 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627645 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627738 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.628002 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.628091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.627478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.632577 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.633053 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.633383 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.633376 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.634200 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.634657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.634888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.635017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.635369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.635411 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.635633 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.636632 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.636665 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.636752 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:24.136701265 +0000 UTC m=+19.898025886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.636859 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.637125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.637612 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.637673 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.637701 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:24.13767759 +0000 UTC m=+19.899002031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.637776 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.637818 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.637940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.637962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638106 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638290 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638301 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638417 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.638862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.639274 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.639384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.639546 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.639603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640159 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640161 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640449 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640641 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640756 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640815 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640796 4764 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.640904 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.641310 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.641596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.641858 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.642461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.642801 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.643409 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.643419 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.643772 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.644166 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.644341 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.644773 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.645045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.645068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.645355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.645475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.646079 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.647975 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648264 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648747 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648807 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.648974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.649258 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.649429 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.650017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.650040 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.650054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.650385 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.650641 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.650704 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651058 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651230 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651422 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651479 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651626 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.652096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.652173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.651321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.652332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.652490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.652850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.652904 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.653016 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.653245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.653320 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.653047 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.653458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.653803 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.654142 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.655432 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:41:24.155398875 +0000 UTC m=+19.916723316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.655842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.656418 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.656790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.661042 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.661081 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.661098 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.661169 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:24.161145883 +0000 UTC m=+19.922470514 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.668289 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.668317 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.668335 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.668407 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:24.168381279 +0000 UTC m=+19.929705690 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.668871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.670052 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.671426 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.671750 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.671894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.672765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.674903 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.675106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.676492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.676642 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.676905 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.677112 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.677336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.677974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.678137 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.678408 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.678962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.679049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.681447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.684535 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.684861 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685113 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685478 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685497 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685981 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.686562 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.686712 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.685830 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.686239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.687236 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.687845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.689148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.689332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.689435 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.692163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.694852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.699754 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.702674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.715006 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.715092 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.724881 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ck5p6"] Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.725387 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.726982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727139 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727150 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727161 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727171 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.727181 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727190 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727199 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727207 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727216 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727225 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727233 4764 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727228 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727242 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727280 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727295 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727308 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727323 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727336 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727347 4764 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727358 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727370 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727381 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727392 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727404 4764 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727433 4764 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727446 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727459 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727471 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727484 4764 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727495 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727509 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727522 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727398 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.727535 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727867 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727885 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727898 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727910 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727922 4764 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727936 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727947 4764 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727959 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727970 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727981 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727993 4764 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728005 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728016 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728027 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728037 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728048 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727403 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728059 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728096 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728112 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728123 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.727454 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728505 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728524 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728608 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728621 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728740 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.728779 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728790 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728804 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728815 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728826 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728838 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728849 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728861 4764 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728872 4764 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728885 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728897 4764 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728922 4764 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728933 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728944 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728954 4764 reconciler_common.go:293] "Volume detached for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728965 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728975 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728985 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.728996 4764 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729006 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729017 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729030 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath 
\"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729042 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729054 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729066 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729080 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729092 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729105 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729118 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: 
I1203 23:41:23.729130 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729143 4764 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729155 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729166 4764 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729178 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729191 4764 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729203 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729215 4764 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729227 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729239 4764 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729250 4764 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729261 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729271 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729283 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729297 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") 
on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729319 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729332 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729344 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729356 4764 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729368 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729378 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729392 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 
23:41:23.729404 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729415 4764 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729427 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729438 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729450 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729461 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729471 4764 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729481 4764 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729492 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729503 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729515 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729527 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729538 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729549 4764 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729560 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc 
kubenswrapper[4764]: I1203 23:41:23.729574 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729589 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729601 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729615 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729627 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729638 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729650 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729660 4764 reconciler_common.go:293] 
"Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729672 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729687 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729698 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729708 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729739 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729749 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729761 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729771 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729782 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729796 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729807 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.729819 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.731611 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.731640 4764 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.731654 
4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.731666 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.731680 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.731696 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734161 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734227 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734248 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734260 4764 reconciler_common.go:293] "Volume detached for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734274 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734287 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734305 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734317 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734328 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734340 4764 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734351 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734361 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734372 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734382 4764 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734396 4764 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734408 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734420 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734450 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node 
\"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734463 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734474 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734484 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734496 4764 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734508 4764 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734520 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734532 4764 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734543 4764 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734553 4764 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734563 4764 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734574 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734588 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734611 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734622 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.734652 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.735021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:41:23 crc kubenswrapper[4764]: E1203 23:41:23.735409 4764 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.753092 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.763416 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.791502 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.814015 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.821080 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.823941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.828056 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.835915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnrh\" (UniqueName: \"kubernetes.io/projected/e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df-kube-api-access-sbnrh\") pod \"node-resolver-ck5p6\" (UID: \"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\") " pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.835988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df-hosts-file\") pod \"node-resolver-ck5p6\" (UID: \"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\") " pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.836039 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.841949 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.876372 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.889225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.905110 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.938021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df-hosts-file\") pod \"node-resolver-ck5p6\" (UID: \"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\") " pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.938087 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbnrh\" (UniqueName: \"kubernetes.io/projected/e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df-kube-api-access-sbnrh\") pod \"node-resolver-ck5p6\" (UID: \"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\") " pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.938197 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df-hosts-file\") pod \"node-resolver-ck5p6\" (UID: \"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\") " pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.957964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnrh\" (UniqueName: \"kubernetes.io/projected/e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df-kube-api-access-sbnrh\") pod \"node-resolver-ck5p6\" (UID: \"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\") " pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.981820 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.985574 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:41:23 crc kubenswrapper[4764]: I1203 23:41:23.993472 
4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.014804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.028158 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.044607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.054881 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ck5p6" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.056570 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.057457 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.074752 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.088241 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.095306 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.103095 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hpltl"] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.103504 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.104258 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.105438 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.110380 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.110485 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.110421 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.110692 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 23:41:24 
crc kubenswrapper[4764]: I1203 23:41:24.120642 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.131496 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.140057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.140299 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.140308 4764 
secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.140489 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:25.140470471 +0000 UTC m=+20.901794882 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.140361 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.140664 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:25.140656336 +0000 UTC m=+20.901980747 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.140878 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.150526 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.174142 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.228480 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe5
3d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241664 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.241708 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:41:25.241686472 +0000 UTC m=+21.003010883 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-rootfs\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.241803 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.241820 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.241838 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241853 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-proxy-tls\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.241878 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:25.241865077 +0000 UTC m=+21.003189488 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.241923 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xnj\" (UniqueName: \"kubernetes.io/projected/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-kube-api-access-79xnj\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.242015 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.242090 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.242109 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.242218 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-03 23:41:25.242185835 +0000 UTC m=+21.003510246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.248034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.262356 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.300528 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.330443 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe5
3d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.342742 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79xnj\" (UniqueName: \"kubernetes.io/projected/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-kube-api-access-79xnj\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.342839 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-rootfs\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.342873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.342898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-proxy-tls\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.342947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-rootfs\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.343782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.348120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-proxy-tls\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.354103 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.363178 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79xnj\" (UniqueName: \"kubernetes.io/projected/dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4-kube-api-access-79xnj\") pod \"machine-config-daemon-hpltl\" (UID: \"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\") " pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.373592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.384630 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-03 23:36:23 +0000 UTC, rotation deadline is 2026-08-25 15:34:28.639815704 +0000 UTC Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.384692 4764 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6351h53m4.255125667s for next certificate rotation Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.387024 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.393396 4764 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393539 4764 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393688 4764 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393753 4764 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393756 4764 reflector.go:484] object-"openshift-machine-config-operator"/"kube-rbac-proxy": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-rbac-proxy": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393776 4764 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch 
close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393786 4764 reflector.go:484] object-"openshift-machine-config-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393799 4764 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393799 4764 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393807 4764 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393825 4764 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393844 4764 reflector.go:484] object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": watch of *v1.Secret ended with: very short watch: 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393875 4764 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393897 4764 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393917 4764 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393956 4764 reflector.go:484] object-"openshift-machine-config-operator"/"proxy-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-machine-config-operator"/"proxy-tls": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.393979 4764 reflector.go:484] object-"openshift-machine-config-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-machine-config-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.393957 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-operator/pods/network-operator-58b4c7f79c-55gtf/status\": read tcp 38.102.83.13:34728->38.102.83.13:6443: use of closed network connection" Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.394004 4764 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.411004 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.420645 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.425538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.429505 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd3dd2ae_2b58_4de0_8ebe_773d83ac87f4.slice/crio-b86c6a5d516c8b7b986cbca666f677c927be533692c73f173b937eb888c9b763 WatchSource:0}: Error finding container b86c6a5d516c8b7b986cbca666f677c927be533692c73f173b937eb888c9b763: Status 404 returned error can't find the container with id b86c6a5d516c8b7b986cbca666f677c927be533692c73f173b937eb888c9b763 Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.441459 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.455899 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.500078 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jc5ck"] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.500866 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xj964"] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.501142 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.501312 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xc6rn"] Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.501499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.502606 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.505671 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.506561 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.506676 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.506695 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.506740 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.506925 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.506991 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.507080 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.507123 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.507202 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.508260 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.508380 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.508491 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.508812 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.530659 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.544943 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:24 crc kubenswrapper[4764]: E1203 23:41:24.545060 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.547487 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.548657 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.549186 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.550034 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.550734 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.551325 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.552661 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.553211 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.553751 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.554858 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.555450 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.556654 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.557496 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.558603 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.559303 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.560147 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.561381 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.561941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.562110 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.563059 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.563702 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.564464 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.565515 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.566115 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.566579 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.567729 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.568131 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.569317 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.570029 4764 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.570887 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.571537 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.572431 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.576456 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.577032 4764 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.577137 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.579571 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.580208 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" 
path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.580594 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.582544 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.583214 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.584199 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.584879 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.586067 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.586509 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.587586 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.588198 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.589373 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.589894 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.590974 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.591607 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.591951 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.592676 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.593140 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.593996 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.594593 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.595107 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.596188 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.596637 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.613280 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.642781 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645460 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-kubelet\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-kubelet\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-systemd-units\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-ovn\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645551 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hjntt\" (UniqueName: \"kubernetes.io/projected/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-kube-api-access-hjntt\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645567 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-system-cni-dir\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645582 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-cnibin\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645598 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-k8s-cni-cncf-io\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645614 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645638 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-netns\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645653 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-var-lib-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-etc-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.645682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-node-log\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646009 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-bin\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646028 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqxfd\" (UniqueName: \"kubernetes.io/projected/86cae340-1f4f-4e51-a68d-fccd8b8f434a-kube-api-access-gqxfd\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646045 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8789b456-ab23-4316-880d-5c02242cd3fd-cni-binary-copy\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646059 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-netd\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-script-lib\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646098 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-os-release\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 
23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646122 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-cni-bin\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646143 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-hostroot\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646164 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-cni-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646182 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8789b456-ab23-4316-880d-5c02242cd3fd-multus-daemon-config\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-multus-certs\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646219 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-socket-dir-parent\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-cni-multus\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdm8\" (UniqueName: \"kubernetes.io/projected/8789b456-ab23-4316-880d-5c02242cd3fd-kube-api-access-tzdm8\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-env-overrides\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc 
kubenswrapper[4764]: I1203 23:41:24.646473 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-netns\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-etc-kubernetes\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646516 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-system-cni-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646536 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-config\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-conf-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646581 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-log-socket\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-systemd\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646729 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cnibin\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.646762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-os-release\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.649251 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-slash\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.649297 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.649322 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovn-node-metrics-cert\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.660524 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.661431 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6e649194c61d918aaee8521af9921f75a4804a8fec99ef493c56058e8d5d2ca9"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.664149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.664200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.664215 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dc82e486bcf57c1cfbb33360790fa93e8c4e248d1081a7c9b90203e8e6601205"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.665285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.665318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c7e6d15726e3d1a27aac3738b0faa52f3b7ac29813429e0a90b61bf4a1485f72"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.666550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.666597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"b86c6a5d516c8b7b986cbca666f677c927be533692c73f173b937eb888c9b763"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.673070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ck5p6" event={"ID":"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df","Type":"ContainerStarted","Data":"5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.673110 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ck5p6" event={"ID":"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df","Type":"ContainerStarted","Data":"5c3bba8d9f8d5fea11f0f5a46bbf574eae25adc73565c1fea5ddc8c8c6204ffb"} Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.674538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.690358 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcon
t/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23
:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.716636 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750510 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-cnibin\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-k8s-cni-cncf-io\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750578 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750602 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-netns\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-var-lib-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-etc-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750685 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-netns\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750681 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-cnibin\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750740 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-k8s-cni-cncf-io\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-var-lib-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750796 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-node-log\") pod \"ovnkube-node-jc5ck\" (UID: 
\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750812 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-bin\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750828 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqxfd\" (UniqueName: \"kubernetes.io/projected/86cae340-1f4f-4e51-a68d-fccd8b8f434a-kube-api-access-gqxfd\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750847 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-netd\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-script-lib\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-bin\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750878 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-os-release\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750823 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-etc-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750892 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8789b456-ab23-4316-880d-5c02242cd3fd-cni-binary-copy\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-cni-bin\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-node-log\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750955 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-netd\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-hostroot\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.750921 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-hostroot\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751074 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-cni-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8789b456-ab23-4316-880d-5c02242cd3fd-multus-daemon-config\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-multus-certs\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751168 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-socket-dir-parent\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-cni-multus\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-env-overrides\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdm8\" (UniqueName: 
\"kubernetes.io/projected/8789b456-ab23-4316-880d-5c02242cd3fd-kube-api-access-tzdm8\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-netns\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-etc-kubernetes\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751385 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-system-cni-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-config\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-conf-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") 
" pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751470 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-log-socket\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751487 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-systemd\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cnibin\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-os-release\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751651 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-slash\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751673 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751730 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8789b456-ab23-4316-880d-5c02242cd3fd-cni-binary-copy\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751744 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-cni-bin\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751698 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovn-node-metrics-cert\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-system-cni-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751817 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-kubelet\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-socket-dir-parent\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751849 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-kubelet\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-cni-multus\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751870 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-systemd-units\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-systemd-units\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-ovn\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751925 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjntt\" (UniqueName: \"kubernetes.io/projected/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-kube-api-access-hjntt\") pod 
\"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.751943 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-system-cni-dir\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-os-release\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-etc-kubernetes\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cnibin\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-slash\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752663 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-ovn\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-env-overrides\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-config\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-openvswitch\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-cni-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752791 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-netns\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-system-cni-dir\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-run-multus-certs\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.752856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-ovn-kubernetes\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-systemd\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-multus-conf-dir\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-log-socket\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753212 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-os-release\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8789b456-ab23-4316-880d-5c02242cd3fd-host-var-lib-kubelet\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-kubelet\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jc5ck\" (UID: 
\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753379 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-script-lib\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8789b456-ab23-4316-880d-5c02242cd3fd-multus-daemon-config\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/86cae340-1f4f-4e51-a68d-fccd8b8f434a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753647 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.753877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/86cae340-1f4f-4e51-a68d-fccd8b8f434a-cni-binary-copy\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: 
\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.758141 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.764288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovn-node-metrics-cert\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.787405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqxfd\" (UniqueName: \"kubernetes.io/projected/86cae340-1f4f-4e51-a68d-fccd8b8f434a-kube-api-access-gqxfd\") pod \"multus-additional-cni-plugins-xc6rn\" (UID: \"86cae340-1f4f-4e51-a68d-fccd8b8f434a\") " pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.812053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjntt\" (UniqueName: \"kubernetes.io/projected/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-kube-api-access-hjntt\") pod \"ovnkube-node-jc5ck\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.815016 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.827744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdm8\" (UniqueName: \"kubernetes.io/projected/8789b456-ab23-4316-880d-5c02242cd3fd-kube-api-access-tzdm8\") pod \"multus-xj964\" (UID: \"8789b456-ab23-4316-880d-5c02242cd3fd\") " pod="openshift-multus/multus-xj964" Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.828444 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d56d81d_b8c8_43d2_a678_d34d2ae54e64.slice/crio-ab37222fc019385670800ba50f9ba9a585448678fa5b8ea48f92c60e20ff4306 WatchSource:0}: Error finding container ab37222fc019385670800ba50f9ba9a585448678fa5b8ea48f92c60e20ff4306: Status 404 returned error can't find the container with id ab37222fc019385670800ba50f9ba9a585448678fa5b8ea48f92c60e20ff4306 Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.829787 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" Dec 03 23:41:24 crc kubenswrapper[4764]: W1203 23:41:24.842039 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cae340_1f4f_4e51_a68d_fccd8b8f434a.slice/crio-3d7f52beda21bef9957acd4611b338f46ec1fb1ee89297e13b3d8a71e3f3e04f WatchSource:0}: Error finding container 3d7f52beda21bef9957acd4611b338f46ec1fb1ee89297e13b3d8a71e3f3e04f: Status 404 returned error can't find the container with id 3d7f52beda21bef9957acd4611b338f46ec1fb1ee89297e13b3d8a71e3f3e04f Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.876635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.906945 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.952978 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:24 crc kubenswrapper[4764]: I1203 23:41:24.988294 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.026043 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.058775 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.099297 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.122052 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xj964" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.138696 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.154560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.154605 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.154736 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.154782 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:27.154769237 +0000 UTC m=+22.916093648 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.155037 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.155063 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:27.155056264 +0000 UTC m=+22.916380675 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.183786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.221782 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.232567 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.255802 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.255924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.255949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256075 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256091 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256094 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:41:27.256054119 +0000 UTC m=+23.017378530 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256103 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256150 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256191 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256207 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256216 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-12-03 23:41:27.256207603 +0000 UTC m=+23.017532014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.256271 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:27.256252465 +0000 UTC m=+23.017576876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.267494 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.298020 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.308768 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.348531 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.383060 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.418357 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.459556 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.498066 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.545680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.545742 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.545897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.546088 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.547087 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.549860 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.596647 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.607917 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.659194 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.667829 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.677744 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77" exitCode=0 Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.677827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.677887 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"ab37222fc019385670800ba50f9ba9a585448678fa5b8ea48f92c60e20ff4306"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.680361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerStarted","Data":"b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.680406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerStarted","Data":"18767ae86cd8c05102a8c96312fbd4a82d03235ed5e4d9488e26cdea67418f56"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.682502 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.683974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerStarted","Data":"7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.684004 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerStarted","Data":"3d7f52beda21bef9957acd4611b338f46ec1fb1ee89297e13b3d8a71e3f3e04f"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.711391 4764 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 23:41:25 crc 
kubenswrapper[4764]: I1203 23:41:25.713646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.713705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.713733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.713861 4764 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.717675 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T
23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.728463 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 
23:41:25.787619 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.809034 4764 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.809301 4764 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.811016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.811052 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.811062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.811078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.811091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:25Z","lastTransitionTime":"2025-12-03T23:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.829180 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.846303 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.851371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.851405 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.851417 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.851433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.851445 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:25Z","lastTransitionTime":"2025-12-03T23:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.861270 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.862764 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.866411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.866440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.866452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.866468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.866480 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:25Z","lastTransitionTime":"2025-12-03T23:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.867444 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.877987 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd56
96181e3f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.881113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.881138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.881150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.881164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.881176 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:25Z","lastTransitionTime":"2025-12-03T23:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.888383 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.897191 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd56
96181e3f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.901055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.901087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.901099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.901115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.901127 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:25Z","lastTransitionTime":"2025-12-03T23:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.908763 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.921205 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-m
arketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc
0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd56
96181e3f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:25 crc kubenswrapper[4764]: E1203 23:41:25.921551 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.924190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.924230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.924243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.924263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.924277 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:25Z","lastTransitionTime":"2025-12-03T23:41:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.948940 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 23:41:25 crc kubenswrapper[4764]: I1203 23:41:25.968220 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.002137 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:25Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.008610 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.031805 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.032006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.032025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.032035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.032048 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.032057 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.080897 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.117578 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.133997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.134027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 
23:41:26.134035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.134051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.134062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.154401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676
e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.197136 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.236325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.236352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.236361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.236374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.236382 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.242116 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.282858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b8
19eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.319868 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.338340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.338396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.338419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.338442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.338459 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.356770 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.398604 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.435313 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.441088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.441110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.441119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.441133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.441143 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.479391 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.523098 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.543396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.543425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.543434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.543448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.543458 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.544921 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:26 crc kubenswrapper[4764]: E1203 23:41:26.545159 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.571509 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.604093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.639019 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.646215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.646265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.646288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.646315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.646338 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.679866 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.691763 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.694656 4764 generic.go:334] "Generic (PLEG): container finished" podID="86cae340-1f4f-4e51-a68d-fccd8b8f434a" containerID="7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a" exitCode=0 Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.694748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerDied","Data":"7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.699466 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.699522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.699544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.699563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.699581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.699597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.726498 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.748963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.749010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.749026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.749051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.749084 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.762777 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.800254 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.835080 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.851477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.851563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.851593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.851627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.851651 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.876770 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.924865 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.953676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.953733 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.953745 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.953764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.953775 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:26Z","lastTransitionTime":"2025-12-03T23:41:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:26 crc kubenswrapper[4764]: I1203 23:41:26.956888 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:26Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.006369 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.038295 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.055562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.055602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.055612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.055626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.055637 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.075693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.119733 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.155006 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.157512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.157560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.157573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.157591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.157603 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.177786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.177860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.177922 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.177978 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:31.177963838 +0000 UTC m=+26.939288249 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.177988 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.178047 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:31.17803056 +0000 UTC m=+26.939355001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.194005 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.240340 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.259835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.259888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.259909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.259934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.259953 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.278893 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.279057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279091 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 23:41:31.279060426 +0000 UTC m=+27.040384877 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.279136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279198 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279223 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279239 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279274 4764 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279296 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:31.279281822 +0000 UTC m=+27.040606243 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279299 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279321 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.279370 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:31.279355364 +0000 UTC m=+27.040679815 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.286560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.362785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.362830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.362839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.362854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.362864 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.465514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.465555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.465567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.465584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.465595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.545039 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.545052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.545178 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:27 crc kubenswrapper[4764]: E1203 23:41:27.545282 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.567598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.567638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.567651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.567669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.567682 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.670191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.670231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.670243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.670259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.670269 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.706528 4764 generic.go:334] "Generic (PLEG): container finished" podID="86cae340-1f4f-4e51-a68d-fccd8b8f434a" containerID="52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9" exitCode=0 Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.706584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerDied","Data":"52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.722568 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.752615 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.768545 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.772368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.772389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.772398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.772412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.772422 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.789445 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z 
is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.808417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.828223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.850804 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.874255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.874297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.874306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.874320 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.874330 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.886964 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.905881 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.913780 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bbbnd"] Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.914113 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.916940 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.916993 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.917336 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.917459 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.926762 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.940524 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.951156 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.967342 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.976462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.976731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.976811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.976912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.976985 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:27Z","lastTransitionTime":"2025-12-03T23:41:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.978663 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.985593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90634371-98e6-4e35-9f0c-06da331c8b04-host\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.985643 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90634371-98e6-4e35-9f0c-06da331c8b04-serviceca\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.985665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v298c\" (UniqueName: \"kubernetes.io/projected/90634371-98e6-4e35-9f0c-06da331c8b04-kube-api-access-v298c\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:27 crc kubenswrapper[4764]: I1203 23:41:27.992167 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:27Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.010223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.036887 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.078064 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.079388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.079413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.079425 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.079442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.079455 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.086890 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90634371-98e6-4e35-9f0c-06da331c8b04-host\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.086933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90634371-98e6-4e35-9f0c-06da331c8b04-serviceca\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.086952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v298c\" (UniqueName: \"kubernetes.io/projected/90634371-98e6-4e35-9f0c-06da331c8b04-kube-api-access-v298c\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.087025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90634371-98e6-4e35-9f0c-06da331c8b04-host\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.087842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/90634371-98e6-4e35-9f0c-06da331c8b04-serviceca\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.123612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v298c\" (UniqueName: \"kubernetes.io/projected/90634371-98e6-4e35-9f0c-06da331c8b04-kube-api-access-v298c\") pod \"node-ca-bbbnd\" (UID: \"90634371-98e6-4e35-9f0c-06da331c8b04\") " pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.137627 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.176188 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.182099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.182140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.182152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.182168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.182178 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.218925 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.226133 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bbbnd" Dec 03 23:41:28 crc kubenswrapper[4764]: W1203 23:41:28.240875 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90634371_98e6_4e35_9f0c_06da331c8b04.slice/crio-f949d74e6a87b2f9edeecac13a0fb9308eddf67f6ee6c8a3e0a474d80a3cc8b5 WatchSource:0}: Error finding container f949d74e6a87b2f9edeecac13a0fb9308eddf67f6ee6c8a3e0a474d80a3cc8b5: Status 404 returned error can't find the container with id f949d74e6a87b2f9edeecac13a0fb9308eddf67f6ee6c8a3e0a474d80a3cc8b5 Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.260478 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.284685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc 
kubenswrapper[4764]: I1203 23:41:28.284748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.284762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.284785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.284799 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.299021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.335475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.378739 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.387805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.387842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.387853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.387869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.387881 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.415527 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.459149 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.490602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.490643 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.490654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.490672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.490684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.544947 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:28 crc kubenswrapper[4764]: E1203 23:41:28.545086 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.592600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.592649 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.592660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.592679 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.592691 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.696095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.696130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.696142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.696157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.696168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.715307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.717701 4764 generic.go:334] "Generic (PLEG): container finished" podID="86cae340-1f4f-4e51-a68d-fccd8b8f434a" containerID="0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22" exitCode=0 Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.717781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerDied","Data":"0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.720383 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bbbnd" event={"ID":"90634371-98e6-4e35-9f0c-06da331c8b04","Type":"ContainerStarted","Data":"019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.720411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bbbnd" event={"ID":"90634371-98e6-4e35-9f0c-06da331c8b04","Type":"ContainerStarted","Data":"f949d74e6a87b2f9edeecac13a0fb9308eddf67f6ee6c8a3e0a474d80a3cc8b5"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.752204 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.765789 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.782075 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.797969 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.798282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc 
kubenswrapper[4764]: I1203 23:41:28.798314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.798329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.798349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.798365 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.818130 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.829744 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.842635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.855655 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.866556 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.880831 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.897709 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.902001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.902048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.902068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.902093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.902115 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:28Z","lastTransitionTime":"2025-12-03T23:41:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.943092 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:28 crc kubenswrapper[4764]: I1203 23:41:28.976434 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.004242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.004278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.004287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 
23:41:29.004304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.004315 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.019690 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.057481 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.099332 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.107164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.107196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.107207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.107223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.107234 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.138551 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.178661 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.209552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.209603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.209615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.209633 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.209645 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.217773 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.259520 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.299083 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.311764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.311807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.311819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.311839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.311851 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.334192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.380746 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.414526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.414562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.414570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.414585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.414596 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.418896 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 
23:41:29.456316 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.498499 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.518113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.518145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.518154 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.518168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.518177 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.536923 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.545183 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.545194 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:29 crc kubenswrapper[4764]: E1203 23:41:29.545333 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:29 crc kubenswrapper[4764]: E1203 23:41:29.545382 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.587291 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.620258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.620294 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.620305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.620322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.620333 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.723655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.723708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.723764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.723792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.723811 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.728883 4764 generic.go:334] "Generic (PLEG): container finished" podID="86cae340-1f4f-4e51-a68d-fccd8b8f434a" containerID="5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8" exitCode=0 Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.728936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerDied","Data":"5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.750411 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.772481 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.796392 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.817639 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.827357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.827399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.827414 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.827435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.827449 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.841222 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd19
94abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T
23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.860474 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-n
ode-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.886849 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.908585 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.930650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.930742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.930763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.930797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.930817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:29Z","lastTransitionTime":"2025-12-03T23:41:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.936463 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:29 crc kubenswrapper[4764]: I1203 23:41:29.979182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c
04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:29Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.038966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.039191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.039204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.039220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.039233 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.041795 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.086587 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.100314 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.135796 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.141780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.141917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.142085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.142247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.142324 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.245740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.245789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.245801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.245819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.245830 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.348966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.349362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.349531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.349706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.350031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.453132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.453204 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.453222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.453251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.453269 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.544937 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:30 crc kubenswrapper[4764]: E1203 23:41:30.545154 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.556422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.556837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.557040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.557241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.557436 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.672132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.672212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.672234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.672259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.672278 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.736749 4764 generic.go:334] "Generic (PLEG): container finished" podID="86cae340-1f4f-4e51-a68d-fccd8b8f434a" containerID="2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475" exitCode=0 Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.736850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerDied","Data":"2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.747873 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.748192 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.754626 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.775294 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.775878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc 
kubenswrapper[4764]: I1203 23:41:30.775896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.775923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.775941 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.780953 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.783175 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.801035 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.822417 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.833102 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.843941 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.858855 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.872015 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.873987 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.879306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.879336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.879348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.879363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.879376 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.882464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.898843 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.910853 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.930391 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.941156 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.951705 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.965279 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.979655 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.981506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.981535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.981546 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.981561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.981575 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:30Z","lastTransitionTime":"2025-12-03T23:41:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:30 crc kubenswrapper[4764]: I1203 23:41:30.989924 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd19
94abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T
23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:30Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.002182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mult
us\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"star
tTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.014599 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.022538 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.036623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.049601 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.060000 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.084347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.084422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.084546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.084579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.084638 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.100134 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.141678 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.187305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.187343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.187355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.187371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.187382 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.193174 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.215980 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.216070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.216170 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.216230 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:39.216211143 +0000 UTC m=+34.977535564 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.216647 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.216803 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:39.216785468 +0000 UTC m=+34.978109899 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.219533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.263497 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.290844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.290921 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.290939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.290966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.290987 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.317533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.317807 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:41:39.317700891 +0000 UTC m=+35.079025342 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.317906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.317972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318134 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318174 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318194 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318221 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318254 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318278 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:39.318254285 +0000 UTC m=+35.079578736 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318279 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.318377 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:39.318345018 +0000 UTC m=+35.079669519 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.394016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.394085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.394109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.394143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.394165 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.497673 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.497790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.497810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.497841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.497863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.545267 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.545421 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.545477 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:31 crc kubenswrapper[4764]: E1203 23:41:31.545689 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.601208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.601247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.601258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.601278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.601289 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.705296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.705349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.705366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.705389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.705407 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.757285 4764 generic.go:334] "Generic (PLEG): container finished" podID="86cae340-1f4f-4e51-a68d-fccd8b8f434a" containerID="16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf" exitCode=0 Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.757678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerDied","Data":"16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.758333 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.776563 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.798525 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.802689 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.808598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.808647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.808656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.808678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.808691 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.830312 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.844961 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.862029 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.877974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.894419 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.908641 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.912052 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.912091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.912103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:31 crc 
kubenswrapper[4764]: I1203 23:41:31.912121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.912133 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:31Z","lastTransitionTime":"2025-12-03T23:41:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.929006 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.941465 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.954636 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.966030 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.975263 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:31 crc kubenswrapper[4764]: I1203 23:41:31.988900 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bb
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:31Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.002762 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb463
9b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.014219 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.015527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.015599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.015614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.015634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.015649 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.026217 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.037048 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.046391 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.054592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.096629 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.117994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc 
kubenswrapper[4764]: I1203 23:41:32.118031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.118043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.118060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.118072 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.142675 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc25
42b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.174393 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.218528 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.221333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.221399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.221440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.221472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.221496 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.262345 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.307891 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.325252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.325307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.325322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.325340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.325354 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.345187 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.391653 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.428401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.428463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.428484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.428509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.428529 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.532128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.532190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.532209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.532231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.532249 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.545636 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:32 crc kubenswrapper[4764]: E1203 23:41:32.545818 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.635496 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.635547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.635564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.635584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.635601 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.739448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.739529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.739554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.739584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.739602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.769648 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" event={"ID":"86cae340-1f4f-4e51-a68d-fccd8b8f434a","Type":"ContainerStarted","Data":"914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.791401 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.814012 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.837217 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.842960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.843027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.843048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.843073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.843092 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.858157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z 
is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.880128 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.901792 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.924646 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.946292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:32 crc 
kubenswrapper[4764]: I1203 23:41:32.946349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.946367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.946392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.946411 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:32Z","lastTransitionTime":"2025-12-03T23:41:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.946436 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.970761 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.983328 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:32 crc kubenswrapper[4764]: I1203 23:41:32.995872 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:32Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.009396 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:33Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.021197 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:33Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.031921 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:33Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.048864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.048920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.048931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc 
kubenswrapper[4764]: I1203 23:41:33.048946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.048956 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.151233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.151270 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.151281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.151300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.151310 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.255455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.255512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.255522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.255536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.255567 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.358874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.358920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.358932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.358950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.358962 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.462126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.462181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.462201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.462226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.462245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.545254 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.545311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:33 crc kubenswrapper[4764]: E1203 23:41:33.545428 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:33 crc kubenswrapper[4764]: E1203 23:41:33.545562 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.564877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.564949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.564971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.564996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.565013 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.668146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.668506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.668517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.668542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.668554 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.772295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.772354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.772370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.772399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.772415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.875959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.876008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.876020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.876040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.876052 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.979276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.979364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.979391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.979426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:33 crc kubenswrapper[4764]: I1203 23:41:33.979449 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:33Z","lastTransitionTime":"2025-12-03T23:41:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.082595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.082667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.082683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.082710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.082769 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.186285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.186348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.186389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.186415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.186433 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.289531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.289588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.289612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.289642 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.289665 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.393357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.393409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.393428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.393455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.393473 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.495657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.495740 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.495757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.495782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.495795 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.544954 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:34 crc kubenswrapper[4764]: E1203 23:41:34.545085 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.557059 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.569669 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.587869 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.597910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.598076 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.598087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.598109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.598121 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.609170 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.623788 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.646694 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.665220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.681660 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.695604 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-
kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.701001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.701257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.701321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.701406 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.701470 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.708830 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.721537 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.735950 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.753962 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0a
d1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:
41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.765609 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.779312 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/0.log" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.783211 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1" exitCode=1 Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.783393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.786442 4764 scope.go:117] "RemoveContainer" 
containerID="631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.804565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.804942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.804995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.805017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.805047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.805070 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.826618 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.848437 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23
:41:33Z\\\",\\\"message\\\":\\\"17 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:33.623039 6017 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623187 6017 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623420 6017 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.624864 6017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 23:41:33.624983 6017 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 23:41:33.625000 6017 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 23:41:33.625050 6017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:33.625077 6017 factory.go:656] Stopping watch factory\\\\nI1203 23:41:33.625066 6017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:41:33.625097 6017 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:41:33.625104 6017 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:41:33.625128 6017 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 
23:41:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9
da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.862535 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.877854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.893592 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.907138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.907177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.907190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.907209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.907221 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:34Z","lastTransitionTime":"2025-12-03T23:41:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.911191 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c1352
70cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.922844 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.938182 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.949536 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.961476 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-no
de-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.977221 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:34 crc kubenswrapper[4764]: I1203 23:41:34.993638 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0a
d1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:
41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.003113 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.013884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.013930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.013942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.013958 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.013971 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.116843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.116889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.116906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.116928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.116945 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.219516 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.219548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.219556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.219569 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.219579 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.321297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.321323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.321331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.321343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.321351 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.423275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.423315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.423326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.423342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.423354 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.526825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.526886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.526904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.526928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.526944 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.580435 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:35 crc kubenswrapper[4764]: E1203 23:41:35.580579 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.580954 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:35 crc kubenswrapper[4764]: E1203 23:41:35.581044 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.581129 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:35 crc kubenswrapper[4764]: E1203 23:41:35.581215 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.629616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.629661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.629673 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.629690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.629702 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.732055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.732105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.732118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.732136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.732148 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.788658 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/0.log" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.791056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.791636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.803034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.814040 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.825452 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.834789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.834825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.834834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.834850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.834863 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.846411 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.859063 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.874498 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.888489 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.908454 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:33Z\\\",\\\"message\\\":\\\"17 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:33.623039 6017 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623187 6017 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623420 6017 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.624864 6017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 23:41:33.624983 6017 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 23:41:33.625000 6017 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 23:41:33.625050 6017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:33.625077 6017 factory.go:656] Stopping watch factory\\\\nI1203 23:41:33.625066 6017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:41:33.625097 6017 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:41:33.625104 6017 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:41:33.625128 6017 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 
23:41:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.922386 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.936437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 
23:41:35.936461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.936470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.936484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.936496 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:35Z","lastTransitionTime":"2025-12-03T23:41:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.940082 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.956620 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.970014 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:35 crc kubenswrapper[4764]: I1203 23:41:35.988747 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.001614 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:35Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.038427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.038487 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.038506 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.038530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.038547 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.141399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.141462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.141482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.141506 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.141524 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.243946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.244002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.244017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.244035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.244064 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.302605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.302666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.302684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.302711 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.302763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: E1203 23:41:36.324618 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.330333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.330397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.330422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.330455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.330478 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: E1203 23:41:36.351655 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.356974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.357020 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.357038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.357065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.357083 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.384308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.384362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.384380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.384404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.384423 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.411354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.411442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.411462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.411499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.411517 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: E1203 23:41:36.432401 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: E1203 23:41:36.432628 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.435313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.435365 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.435383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.435438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.435457 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.538165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.538224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.538241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.538264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.538283 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.638559 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt"] Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.639524 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.642497 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.643006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.643131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.643153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.643184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.643222 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.643525 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.666155 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operat
or@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b
86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.687951 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.706321 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.729192 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.746416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc 
kubenswrapper[4764]: I1203 23:41:36.746486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.746508 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.746541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.746563 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.754044 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f
82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.771204 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.788297 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.792806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40140be4-0168-4866-a807-92f9f0d89a32-env-overrides\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.792964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40140be4-0168-4866-a807-92f9f0d89a32-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.793052 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqbfx\" (UniqueName: \"kubernetes.io/projected/40140be4-0168-4866-a807-92f9f0d89a32-kube-api-access-dqbfx\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.793142 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40140be4-0168-4866-a807-92f9f0d89a32-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: 
\"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.799035 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/1.log" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.799834 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/0.log" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.804387 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9" exitCode=1 Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.804510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.804708 4764 scope.go:117] "RemoveContainer" containerID="631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.805649 4764 scope.go:117] "RemoveContainer" containerID="cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9" Dec 03 23:41:36 crc kubenswrapper[4764]: E1203 23:41:36.805969 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:41:36 crc 
kubenswrapper[4764]: I1203 23:41:36.820124 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.838533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.848897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.848957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.848974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.849003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.849021 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.855075 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.872506 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.889302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.894464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqbfx\" (UniqueName: \"kubernetes.io/projected/40140be4-0168-4866-a807-92f9f0d89a32-kube-api-access-dqbfx\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.894538 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40140be4-0168-4866-a807-92f9f0d89a32-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.894585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/40140be4-0168-4866-a807-92f9f0d89a32-env-overrides\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.894653 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40140be4-0168-4866-a807-92f9f0d89a32-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.895460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/40140be4-0168-4866-a807-92f9f0d89a32-env-overrides\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.895994 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/40140be4-0168-4866-a807-92f9f0d89a32-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.902700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/40140be4-0168-4866-a807-92f9f0d89a32-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc 
kubenswrapper[4764]: I1203 23:41:36.915845 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqbfx\" (UniqueName: \"kubernetes.io/projected/40140be4-0168-4866-a807-92f9f0d89a32-kube-api-access-dqbfx\") pod \"ovnkube-control-plane-749d76644c-94tkt\" (UID: \"40140be4-0168-4866-a807-92f9f0d89a32\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.919061 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:33Z\\\",\\\"message\\\":\\\"17 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:33.623039 6017 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623187 6017 
reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623420 6017 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.624864 6017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 23:41:33.624983 6017 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 23:41:33.625000 6017 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 23:41:33.625050 6017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:33.625077 6017 factory.go:656] Stopping watch factory\\\\nI1203 23:41:33.625066 6017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:41:33.625097 6017 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:41:33.625104 6017 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:41:33.625128 6017 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 
23:41:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.941189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.951791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.951820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:36 crc 
kubenswrapper[4764]: I1203 23:41:36.951829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.951843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.951853 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:36Z","lastTransitionTime":"2025-12-03T23:41:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.960293 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.965010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" Dec 03 23:41:36 crc kubenswrapper[4764]: I1203 23:41:36.980647 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:36 crc kubenswrapper[4764]: W1203 23:41:36.992482 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40140be4_0168_4866_a807_92f9f0d89a32.slice/crio-fa94d4360ef5bad098019b7efa7dd6b0bcb48e3229181142c3d3d7197a0f1dfb WatchSource:0}: Error finding container fa94d4360ef5bad098019b7efa7dd6b0bcb48e3229181142c3d3d7197a0f1dfb: Status 404 returned error can't find the container with id fa94d4360ef5bad098019b7efa7dd6b0bcb48e3229181142c3d3d7197a0f1dfb Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.000265 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:36Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.030309 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://631f2e547c7eb351373177f9caa44a77244de16bf7a195ab8bf299c4316f6df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:33Z\\\",\\\"message\\\":\\\"17 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:33.623039 6017 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623187 6017 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.623420 6017 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 23:41:33.624864 6017 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 23:41:33.624983 6017 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 23:41:33.625000 6017 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 23:41:33.625050 6017 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:33.625077 6017 factory.go:656] Stopping watch factory\\\\nI1203 23:41:33.625066 6017 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:41:33.625097 6017 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:41:33.625104 6017 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:41:33.625128 6017 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 23:41:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.051071 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.057510 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.057538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.057547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 
23:41:37.057560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.057573 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.070982 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.092458 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af
6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.111389 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a
79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.126841 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.143504 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.160470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc 
kubenswrapper[4764]: I1203 23:41:37.160499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.160508 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.160520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.160530 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.162018 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f
82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.180058 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.197843 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.213427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.226540 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.242184 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.263684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.263988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.264000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.264018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.264031 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.368540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.368600 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.368613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.368634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.368644 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.472013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.472071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.472092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.472119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.472144 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.544838 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.544856 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.544860 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:37 crc kubenswrapper[4764]: E1203 23:41:37.544993 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:37 crc kubenswrapper[4764]: E1203 23:41:37.545250 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:37 crc kubenswrapper[4764]: E1203 23:41:37.545376 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.575936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.576005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.576031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.576060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.576082 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.679558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.679623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.679641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.679666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.679684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.783408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.783521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.783540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.783565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.783592 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.809975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" event={"ID":"40140be4-0168-4866-a807-92f9f0d89a32","Type":"ContainerStarted","Data":"fa94d4360ef5bad098019b7efa7dd6b0bcb48e3229181142c3d3d7197a0f1dfb"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.813272 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/1.log" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.820279 4764 scope.go:117] "RemoveContainer" containerID="cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9" Dec 03 23:41:37 crc kubenswrapper[4764]: E1203 23:41:37.820604 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.843807 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.867671 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.885546 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.889164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.889214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.889239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc 
kubenswrapper[4764]: I1203 23:41:37.889264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.889282 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.909798 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.928997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f
2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e7
75fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.946503 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.965746 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.981868 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.992654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.992704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.992741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.992762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:37 crc kubenswrapper[4764]: I1203 23:41:37.992777 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:37Z","lastTransitionTime":"2025-12-03T23:41:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:37.999963 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:37Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.018894 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.035507 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.055223 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.083622 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.095029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.095070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.095084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.095100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.095112 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.105880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.125400 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.134174 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-9fkg4"] Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.134586 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: E1203 23:41:38.134640 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.156301 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.170048 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptable
s-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.181885 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.198054 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.198096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.198108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.198124 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.198138 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.199025 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.210967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.211002 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gfvt\" (UniqueName: \"kubernetes.io/projected/acd1bf47-f475-47f3-95a7-2e0cecec15aa-kube-api-access-4gfvt\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.221567 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f
82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.232453 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.247534 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.263635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc 
kubenswrapper[4764]: I1203 23:41:38.275584 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.289607 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.300622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.300671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.300682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.300700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.300712 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.311353 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.311398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gfvt\" (UniqueName: \"kubernetes.io/projected/acd1bf47-f475-47f3-95a7-2e0cecec15aa-kube-api-access-4gfvt\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: E1203 23:41:38.311839 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:38 crc kubenswrapper[4764]: E1203 23:41:38.311894 4764 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:41:38.811877338 +0000 UTC m=+34.573201759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.312315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.334927 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.338482 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gfvt\" (UniqueName: \"kubernetes.io/projected/acd1bf47-f475-47f3-95a7-2e0cecec15aa-kube-api-access-4gfvt\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.351933 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.363490 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.377858 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.390619 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.403480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.403528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.403546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.403571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.403589 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.506543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.506580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.506590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.506631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.506644 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.609534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.609601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.609624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.609653 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.609675 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.720903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.720968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.720991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.721049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.721074 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.816235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:38 crc kubenswrapper[4764]: E1203 23:41:38.816553 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:38 crc kubenswrapper[4764]: E1203 23:41:38.816709 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:41:39.81666152 +0000 UTC m=+35.577986031 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.823685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.823777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.823807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.823850 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.823876 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.826937 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" event={"ID":"40140be4-0168-4866-a807-92f9f0d89a32","Type":"ContainerStarted","Data":"5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.827010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" event={"ID":"40140be4-0168-4866-a807-92f9f0d89a32","Type":"ContainerStarted","Data":"bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.846680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.867248 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.893978 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.918590 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.926754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.926836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.926857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 
23:41:38.926882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.926900 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:38Z","lastTransitionTime":"2025-12-03T23:41:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.943126 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.964811 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af
6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.983125 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a
79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:38 crc kubenswrapper[4764]: I1203 23:41:38.998828 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.014229 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.029329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.029361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.029369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.029381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.029389 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.030096 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.050930 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.066160 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.078206 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.094803 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.111029 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0a
d1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:
41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.125549 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:39Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:39 crc 
kubenswrapper[4764]: I1203 23:41:39.131568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.131611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.131628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.131651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.131668 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.221798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.221880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.222034 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.222058 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.222114 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:55.222098149 +0000 UTC m=+50.983422570 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.222177 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:55.22214628 +0000 UTC m=+50.983470731 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.236565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.236610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.236621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.236638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.236653 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.322809 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323037 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:41:55.323006402 +0000 UTC m=+51.084330843 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.323221 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.323288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323477 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323511 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323514 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323530 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323553 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323579 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323609 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:55.323590367 +0000 UTC m=+51.084914808 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.323664 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:41:55.323636968 +0000 UTC m=+51.084961429 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.345026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.345089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.345109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.345136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.345156 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.448280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.448349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.448368 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.448394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.448413 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.545639 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.545711 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.545652 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.545857 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.545968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.546121 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.552070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.552139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.552164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.552191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.552217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.656035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.656090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.656107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.656131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.656150 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.759157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.759227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.759243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.759269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.759287 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.829248 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 23:41:39 crc kubenswrapper[4764]: E1203 23:41:39.829368 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:41:41.829335374 +0000 UTC m=+37.590659825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.829001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.862311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.862360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.862376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.862399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.862417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.966231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.966307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.966326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.966348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:39 crc kubenswrapper[4764]: I1203 23:41:39.966366 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:39Z","lastTransitionTime":"2025-12-03T23:41:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.069693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.069788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.069806 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.069828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.069845 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.173200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.173260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.173281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.173308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.173329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.276367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.276424 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.276441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.276465 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.276484 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.379831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.381012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.381377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.381672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.381867 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.484673 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.484909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.484967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.485023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.485074 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.544883 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4"
Dec 03 23:41:40 crc kubenswrapper[4764]: E1203 23:41:40.545047 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.587461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.587890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.588067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.588202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.588341 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.691358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.691411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.691428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.691452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.691470 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.794597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.794652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.794675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.794702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.794756 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.897889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.898226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.898430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.898620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:40 crc kubenswrapper[4764]: I1203 23:41:40.898834 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:40Z","lastTransitionTime":"2025-12-03T23:41:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.003212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.003275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.003293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.003317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.003335 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.106801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.106902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.106921 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.106944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.106960 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.209778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.210082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.210220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.210346 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.210488 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.317125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.317234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.317263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.317288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.317306 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.420669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.421452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.421530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.421625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.421688 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.524866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.524937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.524957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.525009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.525039 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.545594 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.545612 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 23:41:41 crc kubenswrapper[4764]: E1203 23:41:41.545831 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 23:41:41 crc kubenswrapper[4764]: E1203 23:41:41.545993 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.545618 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 23:41:41 crc kubenswrapper[4764]: E1203 23:41:41.546435 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.628767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.629168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.629354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.629617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.629794 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.733000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.733365 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.733960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.734349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.734662 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.837483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.837533 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.837549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.837570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.837586 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.854919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:41 crc kubenswrapper[4764]: E1203 23:41:41.855137 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:41 crc kubenswrapper[4764]: E1203 23:41:41.855254 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:41:45.855223835 +0000 UTC m=+41.616548276 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.940046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.940129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.940153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.940184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:41 crc kubenswrapper[4764]: I1203 23:41:41.940212 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:41Z","lastTransitionTime":"2025-12-03T23:41:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.043509 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.043937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.044120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.044263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.044408 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.148005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.148048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.148062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.148080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.148092 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.251092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.251165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.251192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.251225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.251251 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.354748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.354826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.354844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.354907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.354927 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.457932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.458575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.458677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.458817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.458923 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.545489 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:42 crc kubenswrapper[4764]: E1203 23:41:42.546056 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.561668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.561754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.561775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.561805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.562125 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.664931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.665027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.665047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.665073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.665091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.767790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.767848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.767865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.767891 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.767910 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.870336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.870381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.870398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.870421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.870439 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.972942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.972982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.972998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.973022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:42 crc kubenswrapper[4764]: I1203 23:41:42.973039 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:42Z","lastTransitionTime":"2025-12-03T23:41:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.076115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.076183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.076208 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.076241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.076263 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.179683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.179782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.179818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.179848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.179870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.282900 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.282976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.282991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.283023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.283043 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.386032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.386096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.386120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.386151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.386173 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.489683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.489800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.489843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.489875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.489897 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.544709 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.544808 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.544709 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:43 crc kubenswrapper[4764]: E1203 23:41:43.545029 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:43 crc kubenswrapper[4764]: E1203 23:41:43.545157 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:43 crc kubenswrapper[4764]: E1203 23:41:43.545294 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.593093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.593154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.593172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.593197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.593215 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.696905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.696981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.697003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.697028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.697045 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.800252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.800332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.800358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.800385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.800404 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.904164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.904245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.904269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.904301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:43 crc kubenswrapper[4764]: I1203 23:41:43.904326 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:43Z","lastTransitionTime":"2025-12-03T23:41:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.007372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.007435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.007453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.007480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.007500 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.110067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.110125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.110143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.110169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.110187 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.214280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.214341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.214362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.214393 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.214419 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.318872 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.318927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.318943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.318967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.318986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.421529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.421583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.421597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.421617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.421632 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.525176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.525532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.525549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.525575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.525595 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.545329 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:44 crc kubenswrapper[4764]: E1203 23:41:44.545536 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.570693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.591894 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.615407 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.628378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.628430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.628447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.628471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.628491 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.637206 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c1352
70cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.655179 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.671569 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.689707 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.713347 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcd
cc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.731094 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.731853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.731984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.732090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.732185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.732268 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.749252 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.767867 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.784645 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.799548 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc 
kubenswrapper[4764]: I1203 23:41:44.819514 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.835643 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.835689 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.835705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.835752 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.835769 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.837914 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.854560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.938773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.938841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.938859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.938884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:44 crc kubenswrapper[4764]: I1203 23:41:44.938903 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:44Z","lastTransitionTime":"2025-12-03T23:41:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.042050 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.042144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.042162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.042193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.042210 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.144408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.144881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.144998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.145149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.145247 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.247686 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.247784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.247805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.247831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.247848 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.350624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.350686 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.350706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.350777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.350798 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.453782 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.453863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.453888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.453920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.453941 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.545024 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.545052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:45 crc kubenswrapper[4764]: E1203 23:41:45.545200 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.545280 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:45 crc kubenswrapper[4764]: E1203 23:41:45.545405 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:45 crc kubenswrapper[4764]: E1203 23:41:45.545499 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.557065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.557118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.557135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.557157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.557174 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.660462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.660524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.660564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.660599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.660622 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.764013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.764088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.764118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.764151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.764174 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.866471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.866537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.866564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.866597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.866620 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.899910 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4"
Dec 03 23:41:45 crc kubenswrapper[4764]: E1203 23:41:45.900123 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 23:41:45 crc kubenswrapper[4764]: E1203 23:41:45.900282 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:41:53.900223624 +0000 UTC m=+49.661548135 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.970449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.970517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.970536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.970562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:45 crc kubenswrapper[4764]: I1203 23:41:45.970587 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:45Z","lastTransitionTime":"2025-12-03T23:41:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.074011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.074085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.074104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.074129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.074148 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.178161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.178253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.178276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.178301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.178358 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.281501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.281562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.281580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.281602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.281619 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.385063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.385124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.385141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.385166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.385184 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.488182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.488259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.488287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.488316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.488338 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.545557 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4"
Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.545823 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.591246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.591283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.591295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.591311 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.591322 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.694873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.694937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.694956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.694983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.695004 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.761117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.761174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.761191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.761215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.761232 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.781666 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:46Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.787170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.787245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.787271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.787300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.787318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.807908 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:46Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.813168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.813225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.813244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.813267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.813285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.834274 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:46Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.839568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.839626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.839646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.839669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.839689 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.865343 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:46Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.871138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.871206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.871225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.871253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.871270 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.886283 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:46Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:46 crc kubenswrapper[4764]: E1203 23:41:46.886538 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.888864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.888893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.888907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.888928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.888944 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.991992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.992065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.992083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.992107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:46 crc kubenswrapper[4764]: I1203 23:41:46.992125 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:46Z","lastTransitionTime":"2025-12-03T23:41:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.095300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.095367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.095389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.095419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.095440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.198310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.198382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.198408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.198439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.198462 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.301892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.301951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.301974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.302003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.302026 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.405980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.406052 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.406071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.406097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.406130 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.509242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.509344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.509366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.509392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.509408 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.545667 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.545967 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.546006 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:47 crc kubenswrapper[4764]: E1203 23:41:47.547055 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:47 crc kubenswrapper[4764]: E1203 23:41:47.547140 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:47 crc kubenswrapper[4764]: E1203 23:41:47.547550 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.613174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.613520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.613975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.614391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.614637 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.718438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.718762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.718922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.719105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.719251 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.822836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.823185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.823337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.823501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.823640 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.927087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.927410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.927551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.927773 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:47 crc kubenswrapper[4764]: I1203 23:41:47.928191 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:47Z","lastTransitionTime":"2025-12-03T23:41:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.031848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.032262 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.032421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.032549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.032889 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.136197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.136261 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.136285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.136316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.136338 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.239293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.239396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.239415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.239450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.239475 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.342886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.342953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.342972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.342997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.343014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.446223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.446288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.446303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.446327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.446342 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.545114 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:48 crc kubenswrapper[4764]: E1203 23:41:48.545366 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.549454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.550439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.550666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.550932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.551074 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.654199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.654271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.654295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.654323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.654344 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.757104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.757128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.757136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.757149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.757158 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.860001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.860078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.860098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.860122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.860140 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.963225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.963566 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.963821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.964055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:48 crc kubenswrapper[4764]: I1203 23:41:48.964273 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:48Z","lastTransitionTime":"2025-12-03T23:41:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.067812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.067857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.067875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.067898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.067915 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.170672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.170772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.170795 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.170827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.170846 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.274200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.274662 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.274923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.275122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.275307 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.378972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.379033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.379049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.379074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.379093 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.481954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.482013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.482029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.482052 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.482068 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.544876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.544915 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:49 crc kubenswrapper[4764]: E1203 23:41:49.545057 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.544876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:49 crc kubenswrapper[4764]: E1203 23:41:49.545678 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:49 crc kubenswrapper[4764]: E1203 23:41:49.545773 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.546341 4764 scope.go:117] "RemoveContainer" containerID="cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.585037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.585098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.585121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.585150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.585173 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.688224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.688286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.688308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.688340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.688362 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.791527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.791605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.791627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.791653 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.791667 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.871257 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/1.log" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.875012 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.875544 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.894318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.894362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.894372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.894397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.894410 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.897962 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.921263 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.942084 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69
b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.961618 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.982154 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.997496 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.997535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.997544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.997564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:49 crc kubenswrapper[4764]: I1203 23:41:49.997574 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:49Z","lastTransitionTime":"2025-12-03T23:41:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.007304 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.064896 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.079813 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aa
f3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.095798 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.099789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.099836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.099855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.099879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.099897 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.113041 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.124688 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc 
kubenswrapper[4764]: I1203 23:41:50.153345 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event 
handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.167333 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.189149 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.202861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.202922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.202943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.202969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.202989 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.215834 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.230049 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.305280 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.305339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.305357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.305382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.305401 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.339957 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.350054 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.360986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b
0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa9308
9f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.373576 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.389349 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.405431 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.407757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.407792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.407803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.407822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.407833 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.421974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.436409 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.450914 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.466134 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc 
kubenswrapper[4764]: I1203 23:41:50.484255 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.499986 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.510211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.510273 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.510297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.510326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.510348 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.532252 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.544917 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:50 crc kubenswrapper[4764]: E1203 23:41:50.545187 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.552623 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.571117 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.590692 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.612089 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.612331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.612377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.612394 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.612417 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.612435 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.630861 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd19
94abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T
23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.717236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.717295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.717319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.717347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.717373 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.820390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.820461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.820484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.820518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.820540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.924081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.924457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.924587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.924751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:50 crc kubenswrapper[4764]: I1203 23:41:50.924891 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:50Z","lastTransitionTime":"2025-12-03T23:41:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.027583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.027646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.027664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.027688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.027707 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.130529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.130877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.131178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.131636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.132097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.235211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.235249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.235260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.235277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.235289 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.337971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.338296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.338494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.338789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.338997 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.442041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.442107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.442124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.442147 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.442164 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.545242 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.545642 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:51 crc kubenswrapper[4764]: E1203 23:41:51.545976 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.546005 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.545827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.546094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.546151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.546182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.546202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: E1203 23:41:51.546976 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:51 crc kubenswrapper[4764]: E1203 23:41:51.547084 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.649896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.649960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.649977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.650002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.650018 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.752672 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.753027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.753152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.753331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.753461 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.859639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.859680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.859691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.859708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:51 crc kubenswrapper[4764]: I1203 23:41:51.859741 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:51Z","lastTransitionTime":"2025-12-03T23:41:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.400417 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/2.log" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.401118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.401150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.401160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.401176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.401189 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:52Z","lastTransitionTime":"2025-12-03T23:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.401948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/1.log" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.405277 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7" exitCode=1 Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.405320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.405355 4764 scope.go:117] "RemoveContainer" containerID="cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.406837 4764 scope.go:117] "RemoveContainer" containerID="529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7" Dec 03 23:41:52 crc kubenswrapper[4764]: E1203 23:41:52.407106 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.424994 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.440743 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.465844 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bb
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.492627 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8
a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.503603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.504267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.504328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.504355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.504375 4764 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:52Z","lastTransitionTime":"2025-12-03T23:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.510134 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.528274 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aa
f3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.544757 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:52 crc kubenswrapper[4764]: E1203 23:41:52.544946 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.549381 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.564314 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.579414 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.598122 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 
services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.606979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.607039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.607058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.607078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.607121 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:52Z","lastTransitionTime":"2025-12-03T23:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.615994 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.635159 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.651645 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.670709 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.688814 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.706617 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.709907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.709957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.709973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.709995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.710009 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:52Z","lastTransitionTime":"2025-12-03T23:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.726947 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c1352
70cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:52Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.813954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.813999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.814010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.814027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.814052 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:52Z","lastTransitionTime":"2025-12-03T23:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.916453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.916507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.916519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.916535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:52 crc kubenswrapper[4764]: I1203 23:41:52.916547 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:52Z","lastTransitionTime":"2025-12-03T23:41:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.019978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.020016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.020095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.020110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.020119 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.123567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.123635 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.123652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.123670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.123684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.226962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.227023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.227040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.227067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.227086 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.329948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.330009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.330033 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.330060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.330078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.410889 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/2.log" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.433199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.433257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.433277 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.433299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.433317 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.535676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.535784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.535810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.535840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.535862 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.544853 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.544886 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.544970 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:53 crc kubenswrapper[4764]: E1203 23:41:53.545134 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:53 crc kubenswrapper[4764]: E1203 23:41:53.545254 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:53 crc kubenswrapper[4764]: E1203 23:41:53.545356 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.638827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.638860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.638870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.638884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.638894 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.741244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.741314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.741336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.741367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.741390 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.844754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.844832 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.844868 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.844903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.844923 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.947477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.947540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.947559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.947584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.947603 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:53Z","lastTransitionTime":"2025-12-03T23:41:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:53 crc kubenswrapper[4764]: I1203 23:41:53.995379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:53 crc kubenswrapper[4764]: E1203 23:41:53.995617 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:53 crc kubenswrapper[4764]: E1203 23:41:53.995835 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:42:09.995765841 +0000 UTC m=+65.757090332 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.050648 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.050781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.050802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.050826 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.050844 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.154012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.154082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.154105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.154133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.154154 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.256890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.256957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.256975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.257001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.257019 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.360379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.360440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.360459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.360484 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.360502 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.463616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.463682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.463704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.463766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.463791 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.549529 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:54 crc kubenswrapper[4764]: E1203 23:41:54.549692 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.570960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.571022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.571043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.571070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.571091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.571388 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.592959 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85ae
d21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"star
tedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.619405 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69
b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.638682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.663034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0a
d1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:
41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.675613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.675650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.675665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.675685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.675699 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.678368 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.696047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d49976
6d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.716483 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.733342 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.747680 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.766030 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 
2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.778225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.778289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.778343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.778370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.778388 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.785943 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0
cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.806955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.826569 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.859040 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfb4063972059566e9164d1828a10a0e50add06bd5979cf676e2e6d550e01ff9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"message\\\":\\\"ervice/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722265 6173 
reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 23:41:35.722653 6173 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:41:35.722700 6173 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:41:35.722709 6173 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:41:35.722755 6173 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 23:41:35.722760 6173 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 23:41:35.722777 6173 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 23:41:35.722802 6173 factory.go:656] Stopping watch factory\\\\nI1203 23:41:35.722820 6173 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:41:35.722833 6173 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:41:35.722838 6173 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:41:35.722844 6173 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:41:35.722851 6173 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 23:41:35.722856 6173 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 
services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.874067 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.881308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.881359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.881374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 
23:41:54.881396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.881412 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.890240 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:54Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.984151 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.984202 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.984221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.984250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:54 crc kubenswrapper[4764]: I1203 23:41:54.984268 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:54Z","lastTransitionTime":"2025-12-03T23:41:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.087201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.087262 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.087281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.087307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.087325 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.190101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.190178 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.190199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.190227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.190248 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.293170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.293232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.293250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.293274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.293292 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.310327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.310427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.310450 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.310544 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:42:27.310519886 +0000 UTC m=+83.071844327 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.310613 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.310697 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:42:27.3106693 +0000 UTC m=+83.071993751 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.397000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.397059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.397075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.397099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 
23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.397116 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.411711 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.411898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.411939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.411994 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:42:27.411956193 +0000 UTC m=+83.173280634 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412097 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412121 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412140 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412162 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412200 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:41:55 crc 
kubenswrapper[4764]: E1203 23:41:55.412227 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412205 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:42:27.412188159 +0000 UTC m=+83.173512610 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.412319 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:42:27.412296122 +0000 UTC m=+83.173620563 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.500317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.500374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.500393 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.500415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.500434 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.545314 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.545426 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.545314 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.545539 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.545692 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:55 crc kubenswrapper[4764]: E1203 23:41:55.546040 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.603439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.603491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.603508 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.603556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.603573 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.707304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.707690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.707876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.708064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.708193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.811369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.811431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.811449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.811473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.811490 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.914043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.914103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.914130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.914162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:55 crc kubenswrapper[4764]: I1203 23:41:55.914183 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:55Z","lastTransitionTime":"2025-12-03T23:41:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.017219 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.017305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.017332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.017364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.017384 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.120595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.120652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.120669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.120695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.120712 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.223334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.223691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.223879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.224022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.224165 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.327417 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.327861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.328075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.328245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.328380 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.430895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.430966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.430985 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.431011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.431028 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.534267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.534339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.534362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.534392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.534415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.545019 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:56 crc kubenswrapper[4764]: E1203 23:41:56.545248 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.637916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.638002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.638027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.638059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.638088 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.740485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.740548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.740566 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.740589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.740606 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.843898 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.843944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.843956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.843974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.843986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.946556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.946611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.946627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.946647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:56 crc kubenswrapper[4764]: I1203 23:41:56.946663 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:56Z","lastTransitionTime":"2025-12-03T23:41:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.048585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.048665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.048683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.048742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.048765 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.071325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.071370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.071381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.071399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.071411 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.091103 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:57Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.096741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.096800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.096818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.096844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.096865 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.117267 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:57Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.122414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.122459 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.122476 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.122498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.122514 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.143404 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:57Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.148603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.148655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.148675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.148700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.148769 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.169022 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:57Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.173443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.173558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.173592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.173627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.173650 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.189225 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:57Z is after 2025-08-24T17:21:41Z" Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.189453 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.191391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.191438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.191458 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.191482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.191501 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.295160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.295240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.295275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.295309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.295329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.398094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.398177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.398205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.398236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.398260 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.501126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.501199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.501215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.501240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.501259 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.545220 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.545257 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.545228 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.545393 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.545529 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:57 crc kubenswrapper[4764]: E1203 23:41:57.545645 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.604310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.604364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.604382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.604404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.604420 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.708287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.708359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.708379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.708404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.708422 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.811619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.811671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.811682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.811699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.811710 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.916461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.917308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.917322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.917337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:57 crc kubenswrapper[4764]: I1203 23:41:57.917346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:57Z","lastTransitionTime":"2025-12-03T23:41:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.020571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.020646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.020668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.020699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.020754 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.123910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.123995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.124023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.124057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.124079 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.227386 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.227486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.227505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.227563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.227583 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.330903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.330969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.330987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.331011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.331029 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.432621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.432653 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.432663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.432674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.432701 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.536164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.536224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.536241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.536266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.536285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.545814 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:41:58 crc kubenswrapper[4764]: E1203 23:41:58.546006 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.639035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.639090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.639107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.639129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.639146 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.742455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.742530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.742555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.742590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.742611 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.846186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.846272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.846298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.846327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.846346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.949394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.949444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.949456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.949475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:58 crc kubenswrapper[4764]: I1203 23:41:58.949487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:58Z","lastTransitionTime":"2025-12-03T23:41:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.053040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.053108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.053125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.053150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.053168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.156276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.156376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.156415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.156451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.156471 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.260105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.260163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.260182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.260207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.260229 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.362646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.362699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.362742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.362767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.362785 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.465061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.465173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.465197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.465226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.465246 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.545278 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.545320 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:41:59 crc kubenswrapper[4764]: E1203 23:41:59.545473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.545594 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:41:59 crc kubenswrapper[4764]: E1203 23:41:59.545758 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:41:59 crc kubenswrapper[4764]: E1203 23:41:59.545869 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.568374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.568414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.568431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.568451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.568467 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.671234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.671283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.671304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.671326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.671344 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.774149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.774210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.774227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.774251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.774270 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.877230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.877288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.877305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.877328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.877346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.980591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.980669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.980689 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.980743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:41:59 crc kubenswrapper[4764]: I1203 23:41:59.980763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:41:59Z","lastTransitionTime":"2025-12-03T23:41:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.087968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.088032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.088049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.088073 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.088090 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.191192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.191250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.191274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.191298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.191315 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.294288 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.294330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.294342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.294358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.294369 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.397853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.397938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.397963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.397999 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.398024 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.500862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.500924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.500946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.500975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.500997 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.544997 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:00 crc kubenswrapper[4764]: E1203 23:42:00.545191 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.603115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.603171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.603188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.603211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.603227 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.706854 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.706916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.706933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.706959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.706978 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.809925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.809963 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.809972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.809987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.809998 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.913150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.913232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.913256 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.913289 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:00 crc kubenswrapper[4764]: I1203 23:42:00.913313 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:00Z","lastTransitionTime":"2025-12-03T23:42:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.016969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.017088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.017107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.017133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.017151 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.120343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.120412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.120434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.120467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.120488 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.224034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.224103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.224120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.224148 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.224167 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.326357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.326400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.326416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.326431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.326440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.430238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.430320 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.430340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.430363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.430382 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.533347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.533410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.533427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.533454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.533474 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.545784 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.545876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.545801 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:01 crc kubenswrapper[4764]: E1203 23:42:01.545953 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:01 crc kubenswrapper[4764]: E1203 23:42:01.546128 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:01 crc kubenswrapper[4764]: E1203 23:42:01.546301 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.637540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.637603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.637622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.637654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.637673 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.740186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.740342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.740380 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.740409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.740431 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.842902 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.842961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.842977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.843000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.843016 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.945670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.945758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.945775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.945800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:01 crc kubenswrapper[4764]: I1203 23:42:01.945817 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:01Z","lastTransitionTime":"2025-12-03T23:42:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.048536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.048596 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.048619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.048644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.048661 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.151136 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.151212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.151234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.151264 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.151287 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.253763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.253845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.253869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.253894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.253912 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.357133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.357205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.357226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.357251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.357295 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.460079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.460180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.460205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.460240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.460267 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.545765 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:02 crc kubenswrapper[4764]: E1203 23:42:02.546011 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.562807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.563075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.563240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.563272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.563329 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.667358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.667426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.667445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.667470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.667489 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.771055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.771102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.771119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.771144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.771161 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.874089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.874146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.874163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.874185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.874202 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.977225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.977279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.977296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.977321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:02 crc kubenswrapper[4764]: I1203 23:42:02.977337 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:02Z","lastTransitionTime":"2025-12-03T23:42:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.080359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.080434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.080471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.080505 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.080527 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.182981 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.183042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.183065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.183092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.183116 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.286836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.286905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.286929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.286961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.286985 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.389770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.389842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.389866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.389899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.389923 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.493334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.493383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.493400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.493423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.493440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.545503 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.545560 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.545614 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:03 crc kubenswrapper[4764]: E1203 23:42:03.545760 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:03 crc kubenswrapper[4764]: E1203 23:42:03.546580 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:03 crc kubenswrapper[4764]: E1203 23:42:03.546672 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.547134 4764 scope.go:117] "RemoveContainer" containerID="529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7" Dec 03 23:42:03 crc kubenswrapper[4764]: E1203 23:42:03.547545 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.559871 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb8
7feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.577430 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.591821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.595881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.595931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.595944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc 
kubenswrapper[4764]: I1203 23:42:03.595962 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.595974 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.606652 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.624464 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.636252 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.649807 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bb
c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.669617 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8
a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c
78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.683863 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.695355 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.698181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.698228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.698242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.698266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.698283 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.708974 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc 
kubenswrapper[4764]: I1203 23:42:03.723245 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.739533 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.757022 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.795142 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is 
after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.800933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.800975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.801007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.801024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.801039 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.818788 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.841047 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:03Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.904093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.904137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.904145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.904160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:03 crc kubenswrapper[4764]: I1203 23:42:03.904168 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:03Z","lastTransitionTime":"2025-12-03T23:42:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.007803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.008177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.008317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.008478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.008618 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.111770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.111839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.111862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.111894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.111919 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.215211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.215554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.215698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.215889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.216040 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.319594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.319674 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.319694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.319762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.319796 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.422796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.422859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.422878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.422905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.422960 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.525769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.525810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.525821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.525846 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.525859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.544815 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:04 crc kubenswrapper[4764]: E1203 23:42:04.545029 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.564693 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.576565 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.595485 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.628061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.628118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.628137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.628163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.628180 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.636021 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service 
openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.650649 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.672824 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.688046 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.705954 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.723420 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.730936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.731002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.731021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc 
kubenswrapper[4764]: I1203 23:42:04.731046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.731064 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.745970 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16
aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.764107 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.781667 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.801624 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.823922 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.833586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.833644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.833661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.833685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.833704 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.848220 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.869476 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d80
8ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.884649 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:04Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.936430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.936496 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.936514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.936536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:04 crc kubenswrapper[4764]: I1203 23:42:04.936553 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:04Z","lastTransitionTime":"2025-12-03T23:42:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.039386 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.039447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.039464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.039489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.039509 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.142563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.142873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.142885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.142901 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.142911 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.245249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.245307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.245325 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.245350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.245367 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.348353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.348401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.348412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.348431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.348445 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.451501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.451562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.451579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.451608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.451627 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.545235 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.545286 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.545315 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:05 crc kubenswrapper[4764]: E1203 23:42:05.545430 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:05 crc kubenswrapper[4764]: E1203 23:42:05.545519 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:05 crc kubenswrapper[4764]: E1203 23:42:05.545676 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.553953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.553989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.554001 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.554024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.554040 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.656421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.656468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.656480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.656501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.656522 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.760096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.760146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.760164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.760190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.760207 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.862039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.862109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.862127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.862154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.862174 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.964491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.964556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.964578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.964607 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:05 crc kubenswrapper[4764]: I1203 23:42:05.964629 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:05Z","lastTransitionTime":"2025-12-03T23:42:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.067353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.067414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.067431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.067455 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.067474 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.170482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.170534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.170550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.170573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.170589 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.274278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.274350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.274373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.274404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.274426 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.376634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.376707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.376772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.376804 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.376828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.479249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.479300 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.479312 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.479329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.479341 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.545260 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:06 crc kubenswrapper[4764]: E1203 23:42:06.545453 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.581653 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.581689 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.581701 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.581737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.581749 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.684966 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.685019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.685035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.685056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.685075 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.787964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.788032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.788055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.788086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.788109 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.891147 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.891211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.891233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.891267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.891289 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.994338 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.994429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.994453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.994478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:06 crc kubenswrapper[4764]: I1203 23:42:06.994497 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:06Z","lastTransitionTime":"2025-12-03T23:42:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.096879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.096922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.096935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.096953 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.096966 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.199307 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.199348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.199359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.199375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.199385 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.302695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.302765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.302780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.302799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.302815 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.405430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.405503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.405521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.405545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.405561 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.508022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.508093 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.508114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.508143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.508165 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.544666 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.544827 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.545014 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.545108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.545151 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.545355 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.548489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.548554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.548576 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.548603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.548626 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.568159 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:07Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.572370 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.572418 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.572430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.572448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.572458 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.592547 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:07Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.597142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.597197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.597214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.597237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.597256 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.614535 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:07Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.618880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.618936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.618957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.618986 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.619011 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.635543 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:07Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.640027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.640089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.640113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.640142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.640161 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.660502 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:07Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:07 crc kubenswrapper[4764]: E1203 23:42:07.660760 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.662952 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.662998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.663014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.663036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.663054 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.766943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.766983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.766997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.767024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.767035 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.870025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.870087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.870103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.870126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.870143 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.972990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.973022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.973034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.973051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:07 crc kubenswrapper[4764]: I1203 23:42:07.973068 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:07Z","lastTransitionTime":"2025-12-03T23:42:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.076529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.076565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.076575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.076599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.076609 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.178345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.178383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.178393 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.178408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.178417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.281083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.281137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.281149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.281169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.281182 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.383998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.384059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.384079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.384101 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.384117 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.486197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.486238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.486250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.486267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.486277 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.545381 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:08 crc kubenswrapper[4764]: E1203 23:42:08.545525 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.588528 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.588572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.588584 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.588601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.588613 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.692132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.692197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.692220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.692252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.692276 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.794979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.795043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.795061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.795086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.795103 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.925162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.925210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.925227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.925249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:08 crc kubenswrapper[4764]: I1203 23:42:08.925265 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:08Z","lastTransitionTime":"2025-12-03T23:42:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.028074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.028125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.028142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.028165 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.028181 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.132098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.132169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.132187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.132636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.132693 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.235916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.235973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.235989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.236015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.236033 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.338818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.338918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.338941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.338974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.339001 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.442608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.442667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.442685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.442708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.442803 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.545091 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.545098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:09 crc kubenswrapper[4764]: E1203 23:42:09.545387 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.546350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.546445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.546485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.546513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.546537 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: E1203 23:42:09.546577 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.546648 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:09 crc kubenswrapper[4764]: E1203 23:42:09.546861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.649555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.649650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.649682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.649704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.649743 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.752657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.752774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.752793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.753193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.753390 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.856336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.856434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.856453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.856490 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.856506 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.961123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.961189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.961213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.961758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:09 crc kubenswrapper[4764]: I1203 23:42:09.961831 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:09Z","lastTransitionTime":"2025-12-03T23:42:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.064758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.064799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.064811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.064828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.064839 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.077497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:10 crc kubenswrapper[4764]: E1203 23:42:10.077611 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:42:10 crc kubenswrapper[4764]: E1203 23:42:10.077669 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:42:42.077650942 +0000 UTC m=+97.838975353 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.167477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.167543 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.167561 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.167586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.167608 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.270177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.270222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.270234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.270250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.270262 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.373201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.373292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.373305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.373324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.373340 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.475634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.475685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.475696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.475731 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.475744 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.545449 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:10 crc kubenswrapper[4764]: E1203 23:42:10.545597 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.577743 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.577808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.577821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.577840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.577853 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.681084 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.681150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.681169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.681195 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.681215 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.785171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.785297 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.785327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.785359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.785384 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.888239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.888540 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.888565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.888590 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.888606 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.990792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.990837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.990849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.990866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:10 crc kubenswrapper[4764]: I1203 23:42:10.990875 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:10Z","lastTransitionTime":"2025-12-03T23:42:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.093877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.093916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.093926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.093943 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.093954 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.197293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.197349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.197366 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.197388 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.197407 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.299813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.299855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.299867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.299882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.299892 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.401973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.402038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.402056 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.402081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.402097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.478440 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/0.log" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.478505 4764 generic.go:334] "Generic (PLEG): container finished" podID="8789b456-ab23-4316-880d-5c02242cd3fd" containerID="b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc" exitCode=1 Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.478541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerDied","Data":"b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.479046 4764 scope.go:117] "RemoveContainer" containerID="b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.505184 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is 
after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.505408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.505598 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.505615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.505637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.505654 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.523470 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.539022 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.545154 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.545617 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.545602 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:11 crc kubenswrapper[4764]: E1203 23:42:11.545938 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:11 crc kubenswrapper[4764]: E1203 23:42:11.546250 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:11 crc kubenswrapper[4764]: E1203 23:42:11.546383 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.557560 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.574640 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.589423 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.619053 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.619149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.619187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.619200 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.619220 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.619232 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.632087 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c1352
70cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.645386 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.654651 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.667291 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.680518 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.689048 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.697955 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.708315 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.718053 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.721929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.722047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.722125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.722210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.722287 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.728140 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:11Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:11 crc 
kubenswrapper[4764]: I1203 23:42:11.824799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.824848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.824857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.824873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.824882 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.927801 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.927840 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.927849 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.927864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:11 crc kubenswrapper[4764]: I1203 23:42:11.927874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:11Z","lastTransitionTime":"2025-12-03T23:42:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.030650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.030768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.030793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.030822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.030845 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.133142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.133181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.133192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.133207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.133216 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.239225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.242809 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.242905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.242939 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.242961 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.345531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.345564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.345577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.345592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.345604 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.448364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.448419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.448439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.448467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.448488 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.483317 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/0.log" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.483356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerStarted","Data":"dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.494351 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.507705 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.525942 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service 
openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.536479 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.544776 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:12 crc kubenswrapper[4764]: E1203 23:42:12.544932 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.550209 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.551216 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.551250 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.551266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.551286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.551302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.562854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.574951 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.585543 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.598627 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.611997 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-
kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.624786 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b1
9888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.634305 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.645774 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.653049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.653075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.653085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 
23:42:12.653096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.653107 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.660425 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcab
b9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe5
7dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.670143 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.680256 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.688493 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:12Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:12 crc 
kubenswrapper[4764]: I1203 23:42:12.755186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.755228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.755239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.755255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.755263 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.857339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.857379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.857390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.857405 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.857416 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.959645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.959685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.959696 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.959736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:12 crc kubenswrapper[4764]: I1203 23:42:12.959748 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:12Z","lastTransitionTime":"2025-12-03T23:42:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.062397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.062486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.062555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.062588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.062612 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.164389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.164423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.164432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.164445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.164457 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.267234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.267269 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.267278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.267292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.267303 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.369401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.369457 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.369475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.369499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.369516 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.472295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.472357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.472379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.472410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.472431 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.545328 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.545362 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.545351 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:13 crc kubenswrapper[4764]: E1203 23:42:13.545507 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:13 crc kubenswrapper[4764]: E1203 23:42:13.545617 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:13 crc kubenswrapper[4764]: E1203 23:42:13.545683 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.574544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.574574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.574591 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.574611 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.574628 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.677556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.677599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.677615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.677637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.677656 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.783521 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.783592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.783620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.783778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.783851 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.886762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.886880 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.886908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.886938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.886959 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.989619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.989658 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.989668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.989684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:13 crc kubenswrapper[4764]: I1203 23:42:13.989695 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:13Z","lastTransitionTime":"2025-12-03T23:42:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.092538 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.092594 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.092606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.092626 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.092639 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.195454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.195527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.195549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.195574 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.195591 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.298347 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.298489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.298508 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.298533 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.298553 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.401926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.401967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.401975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.401989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.401998 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.504619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.504687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.504708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.504789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.504813 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.545532 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:14 crc kubenswrapper[4764]: E1203 23:42:14.546173 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.546204 4764 scope.go:117] "RemoveContainer" containerID="529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.566203 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.579672 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.593625 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.605546 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.610441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.610491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.610507 4764 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.610527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.610539 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.618285 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd19
94abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T
23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.633230 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-n
ode-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.652504 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.669225 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914
920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.680635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.694810 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.712682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.712932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.712951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.712961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.712976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.712986 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.732470 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.746635 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc 
kubenswrapper[4764]: I1203 23:42:14.781679 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service 
openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.802796 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.814071 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.815095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.815112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.815120 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.815134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.815143 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.831253 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:14Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.919511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.919558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.919568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:14 crc 
kubenswrapper[4764]: I1203 23:42:14.919582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:14 crc kubenswrapper[4764]: I1203 23:42:14.919597 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:14Z","lastTransitionTime":"2025-12-03T23:42:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.021792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.021829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.021841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.021858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.021871 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.124039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.124071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.124083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.124097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.124108 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.226919 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.226971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.226984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.227003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.227017 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.329554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.329622 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.329642 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.329668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.329687 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.432062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.432107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.432120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.432139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.432154 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.496483 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/2.log" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.500196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.501116 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.523646 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.534658 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.534710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.534750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 
23:42:15.534764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.534775 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.541131 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.545421 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.545513 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:15 crc kubenswrapper[4764]: E1203 23:42:15.545538 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:15 crc kubenswrapper[4764]: E1203 23:42:15.545645 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.545697 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:15 crc kubenswrapper[4764]: E1203 23:42:15.545782 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.555302 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\
\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.568310 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.581061 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.593429 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.607290 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914
920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.615313 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.623437 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.637062 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.637492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.637520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.637530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.637544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.637552 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.646950 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.655783 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.666363 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.676157 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f337
77fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.687004 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.696058 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.715434 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service 
openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", 
E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:15Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.739129 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.739168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.739185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.739207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.739226 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.841352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.841395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.841416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.841552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.841573 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.943941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.944007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.944032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.944060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:15 crc kubenswrapper[4764]: I1203 23:42:15.944083 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:15Z","lastTransitionTime":"2025-12-03T23:42:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.046930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.046979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.046990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.047005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.047016 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.149891 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.149950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.149967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.149991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.150007 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.252439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.252495 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.252518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.252549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.252573 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.354859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.354904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.354924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.354946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.354962 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.458233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.458291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.458309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.458337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.458355 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.504984 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/3.log" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.505852 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/2.log" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.508559 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" exitCode=1 Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.508627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.508689 4764 scope.go:117] "RemoveContainer" containerID="529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.509711 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:42:16 crc kubenswrapper[4764]: E1203 23:42:16.510051 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.526067 4764 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.543413 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.545938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:16 crc kubenswrapper[4764]: E1203 23:42:16.546164 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.561559 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.561587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.561597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.561509 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.561612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.561826 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.576962 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c1352
70cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.589269 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.599957 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.611427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.630423 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f
82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.641589 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.650618 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aa
f3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.662616 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.664789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.664829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.664841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.664857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.664868 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.675815 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.688093 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc 
kubenswrapper[4764]: I1203 23:42:16.713257 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://529c4fab70a1c9f4e5be74962978088087c6dcf7be4ec64522369ed12b1905d7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"message\\\":\\\" x509: certificate has expired or is not yet valid: current time 2025-12-03T23:41:50Z is after 2025-08-24T17:21:41Z]\\\\nI1203 23:41:50.541463 6373 lb_config.go:1031] Cluster endpoints for openshift-kube-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 23:41:50.541629 6373 services_controller.go:443] Built service 
openshift-kube-apiserver-operator/metrics LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.5.109\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI1203 23:41:50.541642 6373 services_controller.go:444] Built service openshift-kube-apiserver-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541650 6373 services_controller.go:445] Built service openshift-kube-apiserver-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI1203 23:41:50.541666 6373 services_controller.go:451] Built service openshift-kube-apiserver-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", E\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:15Z\\\",\\\"message\\\":\\\"o:208] Removed *v1.Node event handler 7\\\\nI1203 23:42:15.361555 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:42:15.361564 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:42:15.361562 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:42:15.361583 6720 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 23:42:15.361665 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 
23:42:15.361692 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:42:15.361709 6720 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:42:15.361728 6720 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:42:15.361740 6720 factory.go:656] Stopping watch factory\\\\nI1203 23:42:15.361755 6720 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:42:15.361771 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 23:42:15.361774 6720 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361779 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:42:15.361796 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:42:15.361813 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 23:42:15.361869 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:42:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\
"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.729362 4764 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd
789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.745908 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.757177 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:16Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.767339 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.767397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.767415 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.767440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.767456 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.870602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.870636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.870645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.870661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.870671 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.973137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.973189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.973205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.973228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:16 crc kubenswrapper[4764]: I1203 23:42:16.973246 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:16Z","lastTransitionTime":"2025-12-03T23:42:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.075451 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.075479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.075486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.075500 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.075509 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.178006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.178039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.178047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.178061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.178070 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.280707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.280764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.280778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.280794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.280805 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.383746 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.383803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.383820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.383842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.383859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.487106 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.487162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.487188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.487223 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.487243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.514981 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/3.log" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.521415 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:42:17 crc kubenswrapper[4764]: E1203 23:42:17.521781 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.542605 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.544938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:17 crc kubenswrapper[4764]: E1203 23:42:17.545471 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.545944 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:17 crc kubenswrapper[4764]: E1203 23:42:17.546070 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.546163 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:17 crc kubenswrapper[4764]: E1203 23:42:17.546404 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.562465 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.579325 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.589284 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.589978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.590014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.590046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.590062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.590071 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.602018 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.613821 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.627575 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.639491 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.656009 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.672979 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914
920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.687241 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc 
kubenswrapper[4764]: I1203 23:42:17.693666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.693750 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.693768 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.693794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.693811 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.701866 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0
cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.727276 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.766478 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.787025 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:15Z\\\",\\\"message\\\":\\\"o:208] Removed *v1.Node event handler 7\\\\nI1203 23:42:15.361555 6720 handler.go:208] 
Removed *v1.EgressIP event handler 8\\\\nI1203 23:42:15.361564 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:42:15.361562 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:42:15.361583 6720 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 23:42:15.361665 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361692 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:42:15.361709 6720 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:42:15.361728 6720 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:42:15.361740 6720 factory.go:656] Stopping watch factory\\\\nI1203 23:42:15.361755 6720 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:42:15.361771 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 23:42:15.361774 6720 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361779 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:42:15.361796 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:42:15.361813 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 23:42:15.361869 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:42:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.796022 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.796045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.796055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.796067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.796077 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.800248 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.813959 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:17Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.898341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.898405 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.898430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.898460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.898481 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.990211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.990234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.990243 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.990255 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:17 crc kubenswrapper[4764]: I1203 23:42:17.990265 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:17Z","lastTransitionTime":"2025-12-03T23:42:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: E1203 23:42:18.010951 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:18Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.016162 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.016225 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.016249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.016278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.016299 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.043519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.043563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.043578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.043599 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.043617 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.068124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.068215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.068233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.068257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.068274 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: E1203 23:42:18.085054 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:18Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.088483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.088545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.088585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.088623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.088647 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: E1203 23:42:18.109255 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:18Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:18 crc kubenswrapper[4764]: E1203 23:42:18.109457 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.111464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.111520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.111536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.111556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.111571 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.213988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.214040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.214058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.214088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.214108 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.317578 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.317667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.317694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.317762 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.317789 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.421212 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.421272 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.421287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.421309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.421322 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.523668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.523749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.523766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.523790 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.523807 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.545663 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:18 crc kubenswrapper[4764]: E1203 23:42:18.545995 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.626793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.626878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.626936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.626972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.626998 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.729746 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.729797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.729807 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.729823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.729832 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.833169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.833251 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.833276 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.833310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.833333 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.936479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.936549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.936571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.936601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:18 crc kubenswrapper[4764]: I1203 23:42:18.936623 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:18Z","lastTransitionTime":"2025-12-03T23:42:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.039456 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.039503 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.039514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.039531 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.039542 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.143031 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.143086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.143102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.143127 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.143144 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.245698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.245792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.245811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.245867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.245888 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.348349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.348399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.348412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.348430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.348442 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.451160 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.451221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.451240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.451265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.451281 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.545574 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.545648 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.545575 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:19 crc kubenswrapper[4764]: E1203 23:42:19.545839 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:19 crc kubenswrapper[4764]: E1203 23:42:19.545946 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:19 crc kubenswrapper[4764]: E1203 23:42:19.546085 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.554646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.554688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.554707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.554754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.554774 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.658179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.658227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.658238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.658258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.658269 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.761002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.761042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.761054 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.761071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.761082 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.863992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.864057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.864083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.864111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.864132 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.967445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.967502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.967519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.967545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:19 crc kubenswrapper[4764]: I1203 23:42:19.967566 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:19Z","lastTransitionTime":"2025-12-03T23:42:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.071306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.071361 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.071376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.071400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.071417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.174411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.174453 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.174462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.174479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.174489 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.277411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.277489 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.277512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.277545 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.277567 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.381077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.381138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.381156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.381180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.381197 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.484434 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.484802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.484972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.485126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.485302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.545508 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:20 crc kubenswrapper[4764]: E1203 23:42:20.545699 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.588010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.588067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.588083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.588140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.588159 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.690595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.690671 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.690695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.690764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.690801 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.793688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.794043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.794226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.794399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.794544 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.897802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.898184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.898357 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.898604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:20 crc kubenswrapper[4764]: I1203 23:42:20.898822 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:20Z","lastTransitionTime":"2025-12-03T23:42:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.001109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.001493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.001679 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.001874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.002020 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.104473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.104527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.104552 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.104580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.104603 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.207017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.207081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.207103 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.207132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.207154 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.310009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.310095 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.310120 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.310154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.310181 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.413373 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.413430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.413446 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.413470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.413487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.516713 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.516805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.516825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.516851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.516870 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.545273 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.545380 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:21 crc kubenswrapper[4764]: E1203 23:42:21.545472 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.545502 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:21 crc kubenswrapper[4764]: E1203 23:42:21.545688 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:21 crc kubenswrapper[4764]: E1203 23:42:21.545869 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.621094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.621158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.621175 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.621198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.621216 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.724448 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.724507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.724527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.724550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.724565 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.826994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.827037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.827046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.827060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.827069 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.929917 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.929975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.929995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.930021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:21 crc kubenswrapper[4764]: I1203 23:42:21.930044 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:21Z","lastTransitionTime":"2025-12-03T23:42:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.033419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.033493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.033518 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.033550 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.033574 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.136865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.136929 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.136954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.136983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.137002 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.240757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.240818 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.240834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.240859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.240878 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.343841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.343941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.343959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.343984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.344001 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.447319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.447386 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.447403 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.447429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.447448 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.545017 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:22 crc kubenswrapper[4764]: E1203 23:42:22.545220 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.549078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.549121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.549137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.549157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.549177 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.651824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.651865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.651877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.651892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.651901 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.755074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.755143 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.755161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.755190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.755207 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.857838 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.857879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.857892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.857911 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.857926 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.961513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.961575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.961592 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.961614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:22 crc kubenswrapper[4764]: I1203 23:42:22.961634 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:22Z","lastTransitionTime":"2025-12-03T23:42:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.064436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.064587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.064612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.064641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.064661 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.167560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.167619 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.167638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.167665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.167684 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.270360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.270408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.270421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.270440 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.270451 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.374062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.374126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.374145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.374172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.374191 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.478008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.478070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.478090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.478115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.478133 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.545435 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.545498 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.545435 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:23 crc kubenswrapper[4764]: E1203 23:42:23.545807 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:23 crc kubenswrapper[4764]: E1203 23:42:23.545909 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:23 crc kubenswrapper[4764]: E1203 23:42:23.546059 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.581296 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.581364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.581387 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.581412 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.581429 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.684077 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.684149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.684168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.684198 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.684216 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.787253 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.787316 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.787334 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.787360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.787379 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.891153 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.891217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.891233 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.891258 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.891274 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.994072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.994118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.994132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.994151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:23 crc kubenswrapper[4764]: I1203 23:42:23.994162 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:23Z","lastTransitionTime":"2025-12-03T23:42:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.097180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.097256 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.097275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.097305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.097323 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.200830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.200899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.200926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.200955 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.200977 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.304410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.304477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.304496 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.304524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.304540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.408340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.408439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.408848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.408918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.408941 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.512133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.512240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.512262 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.512285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.512341 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.544887 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:24 crc kubenswrapper[4764]: E1203 23:42:24.545068 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.563420 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.563954 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.585475 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.607695 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\
\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.614865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.614934 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.614959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.614992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.615016 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.623189 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.638405 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.664376 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.681092 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.696631 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc 
kubenswrapper[4764]: I1203 23:42:24.714479 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.717430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.717464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.717475 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.717493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.717505 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.734195 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.757342 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.792682 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:15Z\\\",\\\"message\\\":\\\"o:208] Removed *v1.Node event handler 7\\\\nI1203 23:42:15.361555 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:42:15.361564 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:42:15.361562 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:42:15.361583 6720 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI1203 23:42:15.361665 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361692 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:42:15.361709 6720 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:42:15.361728 6720 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:42:15.361740 6720 factory.go:656] Stopping watch factory\\\\nI1203 23:42:15.361755 6720 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:42:15.361771 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 23:42:15.361774 6720 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361779 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:42:15.361796 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:42:15.361813 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 23:42:15.361869 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:42:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.812162 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.821174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.821252 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.821278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 
23:42:24.821315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.821339 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.837672 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.861439 4764 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af
6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.876078 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a
79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.892884 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:24Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.925246 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.925292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.925305 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:24 crc 
kubenswrapper[4764]: I1203 23:42:24.925324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:24 crc kubenswrapper[4764]: I1203 23:42:24.925342 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:24Z","lastTransitionTime":"2025-12-03T23:42:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.028602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.029017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.029030 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.029049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.029062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.131975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.132038 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.132064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.132094 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.132116 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.235039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.235341 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.235398 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.235422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.235440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.339049 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.339137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.339194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.339218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.339236 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.442651 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.442764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.442792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.442819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.442840 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545362 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545469 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545610 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:25 crc kubenswrapper[4764]: E1203 23:42:25.545667 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545886 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545907 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: E1203 23:42:25.545912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.545987 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: E1203 23:42:25.546016 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.648582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.648643 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.648663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.648690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.648709 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.751878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.751937 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.751957 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.751980 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.751997 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.855105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.855145 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.855163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.855186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.855203 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.958648 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.958791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.958819 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.958852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:25 crc kubenswrapper[4764]: I1203 23:42:25.958875 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:25Z","lastTransitionTime":"2025-12-03T23:42:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.062257 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.062308 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.062329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.062356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.062417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.166058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.166114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.166134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.166164 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.166188 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.269602 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.269661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.269685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.269783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.269809 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.373050 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.373109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.373128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.373159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.373177 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.476315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.476354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.476364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.476379 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.476388 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.545166 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:26 crc kubenswrapper[4764]: E1203 23:42:26.545413 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.565532 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.579930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.579988 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.580006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.580035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.580053 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.683522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.683582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.683604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.683630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.683648 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.787422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.787498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.787522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.787553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.787577 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.891234 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.891303 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.891323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.891349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.891366 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.994857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.994906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.994923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.994947 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:26 crc kubenswrapper[4764]: I1203 23:42:26.994964 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:26Z","lastTransitionTime":"2025-12-03T23:42:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.098754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.098813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.098833 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.098857 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.098874 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.202405 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.202471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.202488 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.202512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.202531 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.305617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.305660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.305676 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.305699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.305747 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.365033 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.365136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.365276 4764 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.365296 4764 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.365367 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:43:31.365338649 +0000 UTC m=+147.126663100 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.365400 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 23:43:31.36538443 +0000 UTC m=+147.126708881 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.408583 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.408628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.408644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.408666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.408684 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.466539 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.466700 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:31.466666395 +0000 UTC m=+147.227990856 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.466795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.466863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467072 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467103 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467109 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" 
not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467135 4764 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467143 4764 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467156 4764 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467235 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 23:43:31.46721059 +0000 UTC m=+147.228535031 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.467277 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 23:43:31.467256092 +0000 UTC m=+147.228580543 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.511893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.511948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.511971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.511997 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.512021 4764 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.544867 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.544921 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.544888 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.545053 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.545230 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:27 crc kubenswrapper[4764]: E1203 23:42:27.545366 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.614568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.614609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.614624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.614644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.614662 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.717477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.717549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.717571 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.717765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.717797 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.820624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.820657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.820667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.820682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.820692 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.924126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.924186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.924206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.924231 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:27 crc kubenswrapper[4764]: I1203 23:42:27.924248 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:27Z","lastTransitionTime":"2025-12-03T23:42:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.027187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.027263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.027287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.027317 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.027340 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.129791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.129869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.129890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.129914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.129931 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.233108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.233171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.233188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.233213 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.233230 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.336238 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.336286 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.336298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.336315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.336327 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.349188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.349260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.349284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.349313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.349338 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.371680 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.376987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.377032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.377042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.377061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.377072 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.392677 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.397889 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.398013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.398034 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.398057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.398077 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.419014 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.423154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.423194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.423210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.423230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.423246 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.437621 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.442587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.442644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.442665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.442690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.442710 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.459138 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:28Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.459283 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.461150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.461180 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.461190 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.461206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.461280 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.545168 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.545605 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.545964 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:42:28 crc kubenswrapper[4764]: E1203 23:42:28.546221 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.565116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.565189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.565210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.565235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.565263 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.668973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.669010 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.669019 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.669039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.669049 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.772249 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.772315 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.772324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.772344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.772357 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.875062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.875166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.875191 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.875221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.875242 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.978028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.978110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.978135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.978168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:28 crc kubenswrapper[4764]: I1203 23:42:28.978193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:28Z","lastTransitionTime":"2025-12-03T23:42:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.080989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.081021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.081032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.081047 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.081058 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.183766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.183825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.183842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.183866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.183883 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.287156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.287218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.287239 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.287265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.287286 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.390687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.390788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.390814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.390847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.390867 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.494777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.494820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.494836 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.494859 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.494875 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.545459 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.545461 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:29 crc kubenswrapper[4764]: E1203 23:42:29.545805 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.545491 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:29 crc kubenswrapper[4764]: E1203 23:42:29.545958 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:29 crc kubenswrapper[4764]: E1203 23:42:29.546160 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.598235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.598310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.598332 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.598363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.598386 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.703236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.703327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.703350 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.703382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.703404 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.806299 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.806342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.806353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.806371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.806382 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.908875 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.908961 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.908982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.909011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:29 crc kubenswrapper[4764]: I1203 23:42:29.909028 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:29Z","lastTransitionTime":"2025-12-03T23:42:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.012529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.012603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.012616 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.012642 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.012656 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.115655 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.115746 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.115765 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.115792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.115814 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.219330 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.219390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.219413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.219444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.219466 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.322779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.322837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.322858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.322881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.322898 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.425702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.425775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.425787 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.425802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.425813 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.528640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.528695 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.528708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.528748 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.528763 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.545338 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:30 crc kubenswrapper[4764]: E1203 23:42:30.545550 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.632122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.632159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.632170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.632186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.632199 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.735830 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.735979 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.736000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.736024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.736044 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.839294 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.839369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.839392 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.839419 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.839437 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.943783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.943858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.943877 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.943905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:30 crc kubenswrapper[4764]: I1203 23:42:30.943932 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:30Z","lastTransitionTime":"2025-12-03T23:42:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.047158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.047224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.047293 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.047320 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.047336 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.151066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.151117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.151134 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.151158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.151174 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.254680 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.254778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.254805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.254834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.254856 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.359122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.359240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.359263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.359290 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.359312 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.463567 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.463681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.463699 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.463754 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.463774 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.545010 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.545062 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.545010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:31 crc kubenswrapper[4764]: E1203 23:42:31.545230 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:31 crc kubenswrapper[4764]: E1203 23:42:31.545370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:31 crc kubenswrapper[4764]: E1203 23:42:31.545473 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.569822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.569874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.569895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.569924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.569946 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.672172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.672237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.672260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.672281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.672297 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.774967 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.775021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.775037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.775057 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.775074 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.878514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.878589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.878613 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.878641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.878662 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.982565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.982661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.982688 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.982756 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:31 crc kubenswrapper[4764]: I1203 23:42:31.982784 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:31Z","lastTransitionTime":"2025-12-03T23:42:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.086265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.086321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.086342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.086389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.086417 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.190383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.190454 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.190480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.190512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.190534 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.293825 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.293893 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.293931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.293971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.293994 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.397631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.397707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.397771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.397800 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.397824 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.500423 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.500480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.500498 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.500523 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.500540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.544956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:32 crc kubenswrapper[4764]: E1203 23:42:32.545170 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.603274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.603337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.603355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.603377 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.603397 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.708110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.708167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.708183 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.708207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.708224 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.812014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.812114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.812161 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.812185 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.812201 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.915472 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.915529 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.915546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.915570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:32 crc kubenswrapper[4764]: I1203 23:42:32.915589 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:32Z","lastTransitionTime":"2025-12-03T23:42:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.019685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.019796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.019828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.019863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.019882 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.123209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.123283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.123306 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.123335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.123359 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.226519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.226589 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.226610 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.226632 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.226644 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.330048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.330097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.330116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.330139 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.330155 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.433088 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.433156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.433173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.433199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.433216 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.536525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.536577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.536595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.536617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.536634 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.545367 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.545383 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.545364 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:33 crc kubenswrapper[4764]: E1203 23:42:33.545526 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:33 crc kubenswrapper[4764]: E1203 23:42:33.545692 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:33 crc kubenswrapper[4764]: E1203 23:42:33.545859 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.639058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.639107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.639124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.639146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.639163 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.742789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.742845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.742862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.742885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.742902 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.849439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.849512 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.849547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.849617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.849638 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.953597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.953667 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.953710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.953788 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:33 crc kubenswrapper[4764]: I1203 23:42:33.953815 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:33Z","lastTransitionTime":"2025-12-03T23:42:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.057871 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.057968 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.058017 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.058043 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.058061 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.161344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.161407 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.161431 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.161464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.161489 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.265464 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.265563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.265586 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.265617 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.265639 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.369024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.369117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.369146 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.369182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.369245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.472354 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.472494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.472515 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.472618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.472649 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.545854 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:34 crc kubenswrapper[4764]: E1203 23:42:34.546278 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.568469 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.575858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.575914 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.575930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.575948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.575961 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.601860 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:15Z\\\",\\\"message\\\":\\\"o:208] Removed *v1.Node event handler 7\\\\nI1203 23:42:15.361555 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:42:15.361564 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:42:15.361562 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:42:15.361583 6720 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI1203 23:42:15.361665 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361692 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:42:15.361709 6720 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:42:15.361728 6720 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:42:15.361740 6720 factory.go:656] Stopping watch factory\\\\nI1203 23:42:15.361755 6720 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:42:15.361771 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 23:42:15.361774 6720 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361779 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:42:15.361796 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:42:15.361813 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 23:42:15.361869 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:42:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.622447 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.643034 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.665271 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.679313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.679367 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.679383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.679408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.679428 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.688036 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65da185-54b3-4571-8bb7-b3a0c870066e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1b58481c4ab461b489b16f6d59f6387aa77ef8e970cd733acff0dd7f6e485f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64ea52f1c788768f5642c3a3e8a9dc1982f28b808e2f01b64623f0ab9930d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f07b3af8030b087d047498877c1f4b294fd0eeaac9ee25a3e6153a0d565d934a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be35b25e472773c9ed1239d4805d0a6dd46a7681a3b0ab830a052bcfa147b21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://041271798de9eb2f2ab9014d24b9fea9364f5347713daafb99ac2163438beeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74da77d09f44056600e3a1577d4df7874c2bad6840053e51bc756ba2afbc964f\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da77d09f44056600e3a1577d4df7874c2bad6840053e51bc756ba2afbc964f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b420723787bee5ebd89e193889c9cf6d18b00596e4267e3dddf294840b448eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b420723787bee5ebd89e193889c9cf6d18b00596e4267e3dddf294840b448eb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b8ea24d297942b7727803442c9e46e572aa631249160b6f4bc691e7e9fcf44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8ea24d297942b7727803442c9e46e572aa631249160b6f4bc691e7e9fcf44d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.708038 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.726207 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32
b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.744611 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:0
6Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.760290 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.776005 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.781301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.781327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.781337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.781352 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.781362 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.789529 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.807430 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.828600 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c269511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.846674 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.866630 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.881386 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4db6a96-06b6-4527-b6b8-2dae5ad38afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dae4bdca42055f4db895a31b2f091f4daa189af3019cc5dc0065244f64220792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02d507c3e052b0f28591fb8ff526916ff9edb288dcf8102fd230759f569a56c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a02d507c3e052b0f28591fb8ff526916ff9edb288dcf8102fd230759f569a56c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.884843 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.884898 4764 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.884916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.884973 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.884993 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.901169 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.918152 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:34Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:34 crc 
kubenswrapper[4764]: I1203 23:42:34.987992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.988062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.988082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.988109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:34 crc kubenswrapper[4764]: I1203 23:42:34.988129 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:34Z","lastTransitionTime":"2025-12-03T23:42:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.090916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.091000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.091014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.091032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.091097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.194080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.194132 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.194149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.194171 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.194312 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.296803 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.296873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.296897 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.296926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.296947 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.400003 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.400065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.400086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.400114 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.400132 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.504987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.505065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.505089 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.505119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.505142 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.545635 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.545682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.545641 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:35 crc kubenswrapper[4764]: E1203 23:42:35.545864 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:35 crc kubenswrapper[4764]: E1203 23:42:35.545950 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:35 crc kubenswrapper[4764]: E1203 23:42:35.546045 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.607910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.608172 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.608362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.608520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.608901 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.712313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.712360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.712376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.712397 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.712415 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.816237 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.816314 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.816342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.816371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.816388 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.919549 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.919614 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.919631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.919657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:35 crc kubenswrapper[4764]: I1203 23:42:35.919675 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:35Z","lastTransitionTime":"2025-12-03T23:42:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.023282 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.023353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.023376 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.023401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.023420 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.127029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.127060 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.127068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.127081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.127091 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.229805 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.229895 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.229912 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.229936 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.229955 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.333117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.333182 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.333206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.333235 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.333256 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.435654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.435747 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.435766 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.435789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.435807 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.538705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.538797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.538815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.538842 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.538859 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.545195 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:36 crc kubenswrapper[4764]: E1203 23:42:36.545694 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.641595 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.641650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.641706 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.641769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.641787 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.744976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.745051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.745062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.745079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.745117 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.847736 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.847774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.847785 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.847799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.847809 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.950121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.950156 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.950169 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.950186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:36 crc kubenswrapper[4764]: I1203 23:42:36.950198 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:36Z","lastTransitionTime":"2025-12-03T23:42:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.053283 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.053349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.053372 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.053402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.053426 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.156647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.156700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.156770 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.156813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.156831 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.260692 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.260763 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.260779 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.260802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.260814 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.366690 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.367408 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.367432 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.367492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.367674 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.471522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.471582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.471606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.471636 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.471657 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.544825 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.544830 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.544830 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:37 crc kubenswrapper[4764]: E1203 23:42:37.545439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:37 crc kubenswrapper[4764]: E1203 23:42:37.545516 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:37 crc kubenswrapper[4764]: E1203 23:42:37.545566 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.574005 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.574064 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.574081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.574105 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.574122 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.677051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.677112 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.677128 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.677152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.677168 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.779585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.779631 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.779642 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.779659 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.779672 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.882324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.882396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.882416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.882437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.882451 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.985708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.986102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.986240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.986390 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:37 crc kubenswrapper[4764]: I1203 23:42:37.986540 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:37Z","lastTransitionTime":"2025-12-03T23:42:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.091971 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.092062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.092087 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.092150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.092176 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.195774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.195896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.195922 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.195954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.195978 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.299764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.299855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.299881 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.299916 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.299940 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.402548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.402987 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.403207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.403414 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.403622 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.506462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.506517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.506537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.506605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.506628 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.545353 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.545527 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.596242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.596309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.596322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.596340 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.596352 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.615448 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.620928 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.620995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.621018 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.621050 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.621070 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.643350 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.649395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.649478 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.649502 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.649536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.649559 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.670962 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.676845 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.676906 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.676924 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.676950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.676968 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.697189 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.703097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.703152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.703170 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.703194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.703214 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.723799 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:38Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:38 crc kubenswrapper[4764]: E1203 23:42:38.724404 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.732950 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.733011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.733029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.733054 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.733071 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.837188 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.837265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.837298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.837328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.837347 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.940014 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.940082 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.940099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.940126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:38 crc kubenswrapper[4764]: I1203 23:42:38.940144 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:38Z","lastTransitionTime":"2025-12-03T23:42:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.043381 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.043469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.043493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.043524 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.043549 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.147678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.147802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.147888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.147918 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.147935 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.251186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.251260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.251281 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.251310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.251332 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.355621 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.356016 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.356035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.356058 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.356078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.458647 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.458700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.458741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.458764 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.458780 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.544962 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.545044 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.545332 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:39 crc kubenswrapper[4764]: E1203 23:42:39.545300 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:39 crc kubenswrapper[4764]: E1203 23:42:39.545665 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:39 crc kubenswrapper[4764]: E1203 23:42:39.546461 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.547083 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:42:39 crc kubenswrapper[4764]: E1203 23:42:39.547342 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.562144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.562197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.562214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.562236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.562258 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.665469 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.665535 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.665558 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.665587 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.665609 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.768909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.768974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.768991 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.769015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.769032 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.872268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.872343 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.872364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.872389 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.872405 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.975443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.975507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.975546 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.975582 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:39 crc kubenswrapper[4764]: I1203 23:42:39.975606 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:39Z","lastTransitionTime":"2025-12-03T23:42:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.078708 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.078784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.078799 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.078817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.078829 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.181668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.181737 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.181749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.181767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.181778 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.284678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.284777 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.284798 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.284822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.284842 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.387866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.387915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.387926 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.387942 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.387954 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.490385 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.490430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.490443 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.490461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.490473 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.545438 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:40 crc kubenswrapper[4764]: E1203 23:42:40.545654 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.593091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.593131 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.593140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.593157 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.593170 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.696678 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.696812 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.696839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.696864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.696913 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.800604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.800663 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.800679 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.800702 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.800761 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.904890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.904976 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.905000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.905027 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:40 crc kubenswrapper[4764]: I1203 23:42:40.905044 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:40Z","lastTransitionTime":"2025-12-03T23:42:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.008479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.008532 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.008544 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.008560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.008574 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.112822 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.112891 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.112908 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.112932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.112951 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.215992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.216061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.216085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.216116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.216138 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.320025 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.320081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.320098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.320121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.320139 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.423938 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.424015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.424041 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.424072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.424095 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.527141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.527189 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.527205 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.527228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.527245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.546003 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.546056 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.546056 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:41 crc kubenswrapper[4764]: E1203 23:42:41.546225 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:41 crc kubenswrapper[4764]: E1203 23:42:41.546417 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:41 crc kubenswrapper[4764]: E1203 23:42:41.546500 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.630609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.630681 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.630704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.630783 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.630810 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.734013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.734074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.734096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.734124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.734146 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.837483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.837553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.837575 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.837605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.837628 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.940771 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.940847 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.940870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.940896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:41 crc kubenswrapper[4764]: I1203 23:42:41.940912 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:41Z","lastTransitionTime":"2025-12-03T23:42:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.044362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.044428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.044460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.044499 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.044522 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.132885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:42 crc kubenswrapper[4764]: E1203 23:42:42.133093 4764 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:42:42 crc kubenswrapper[4764]: E1203 23:42:42.133213 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs podName:acd1bf47-f475-47f3-95a7-2e0cecec15aa nodeName:}" failed. No retries permitted until 2025-12-03 23:43:46.133182229 +0000 UTC m=+161.894506680 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs") pod "network-metrics-daemon-9fkg4" (UID: "acd1bf47-f475-47f3-95a7-2e0cecec15aa") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.148144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.148207 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.148224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.148247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.148265 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.251410 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.251537 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.251568 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.251638 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.251666 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.355015 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.355099 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.355119 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.355149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.355210 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.458324 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.458382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.458400 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.458428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.458451 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.545230 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:42 crc kubenswrapper[4764]: E1203 23:42:42.545422 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.561302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.561371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.561394 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.561420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.561440 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.664383 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.664441 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.664460 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.664486 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.664504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.766990 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.767046 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.767066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.767092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.767110 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.870707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.870816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.870841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.870870 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.870892 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.973941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.973994 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.974092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.974121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:42 crc kubenswrapper[4764]: I1203 23:42:42.974137 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:42Z","lastTransitionTime":"2025-12-03T23:42:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.076860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.076948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.076978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.077004 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.077020 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.180565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.180664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.180691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.180747 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.180766 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.283637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.283704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.283772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.283797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.283818 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.386657 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.386823 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.386851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.386883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.386906 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.490242 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.490309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.490326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.490355 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.490381 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.544703 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.544824 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.544836 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:43 crc kubenswrapper[4764]: E1203 23:42:43.544945 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:43 crc kubenswrapper[4764]: E1203 23:42:43.545050 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:43 crc kubenswrapper[4764]: E1203 23:42:43.545165 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.593960 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.594028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.594045 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.594075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.594094 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.697083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.697154 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.697177 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.697206 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.697228 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.800409 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.800493 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.800522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.800556 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.800578 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.903029 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.903098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.903115 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.903135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:43 crc kubenswrapper[4764]: I1203 23:42:43.903193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:43Z","lastTransitionTime":"2025-12-03T23:42:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.006433 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.006526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.006551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.006580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.006604 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.110471 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.110547 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.110573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.110603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.110625 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.214364 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.214436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.214458 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.214482 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.214499 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.317837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.317913 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.317933 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.317959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.317977 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.420784 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.420851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.420869 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.420894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.420914 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.524507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.524573 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.524629 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.524661 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.524685 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.545906 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:44 crc kubenswrapper[4764]: E1203 23:42:44.546218 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.571854 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bff4c6e5-5fff-40e5-babc-02077d45f75a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.588208 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8aa821c6-8e16-4349-bc1e-deb4018489bd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3b2251cfeb26c132855b43c1d188e9f67bc1795ca93f5755a4b657782d477897\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa08564b6f208a44bb6ba0eb0882834671892bbcfbe4239c4322da7d8c94da5b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88055e89a86146d92dc184484ca147f0796cc67f048c0dbe53d4119d5df62138\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.609667 4764 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d6c4788a4332636263f4204669e71f81b1cd1994abcfcbf0b4f37586c39060e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-79xnj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hpltl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.629470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.629520 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.629537 4764 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.629564 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.629583 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.637443 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"86cae340-1f4f-4e51-a68d-fccd8b8f434a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://914920f2c0d3fec8435285f0f2d6525b0346c07017c27f8a15248f455c2
69511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a1edd082507571fa8ea00af8b0036d75ca659edc415f1f0ad1563837e2b439a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52d191bb881eb7fcfc27e9abcb2a281fc0b0925dc0553b3efb4030d237438db9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0bf9c603d46dc155a450a8541f143bf4edc2542b1a6c9b79f56d7555a714bb22\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5f31f82adf23e593de0dc476d5d5644f8b7d3c08a48c0801374beffa50cb42f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cecebaa2c037a8c67545617a4a07eff686edcdcc2ff36e9b12e370f90021475\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16aa279e775fe42ba85484d27cfe57dce7d20c3751bfbff5b69c7fd8018ad7bf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gqxfd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-xc6rn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.653248 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bbbnd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90634371-98e6-4e35-9f0c-06da331c8b04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://019ebe9a1d26131121d73ea12b9c34d73bf0f756a54f5e3c63629d954c98aab9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v298c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bbbnd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.673778 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40140be4-0168-4866-a807-92f9f0d89a32\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc7d499766d774eb89eb479b830f7b995c457daae5de0f469cc6aaeb17498f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613ad484dbb0ec49c8a231dd3e2a340506aaf3cec2de2171ff23a542daf515b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqbfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-94tkt\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.687909 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4db6a96-06b6-4527-b6b8-2dae5ad38afe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dae4bdca42055f4db895a31b2f091f4daa189af3019cc5dc0065244f64220792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a02d507c3e052b0f28591fb8ff526916ff9edb288dcf8102fd230759f569a56c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a02d507c3e052b0f28591fb8ff526916ff9edb288dcf8102fd230759f569a56c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.707240 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d574396f3feeee9c4577bee84081e2c07fa5fe304e6e922b5d468dacdfac68b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.724019 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:26Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9abf3a9133a632bee5867dc6191196801f496560a38ae0527471191f2a73f511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.732123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.732196 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.732210 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.732228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.732262 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.738797 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-ck5p6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9b1a8b4-35ef-4cdb-9baa-28dd562fb8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6c557443cbe7b676e3a83188c8f6ea2c6b481192440baa9bb89b44f048e9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sbnrh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-ck5p6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.758329 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xj964" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8789b456-ab23-4316-880d-5c02242cd3fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:11Z\\\",\\\"message\\\":\\\"2025-12-03T23:41:26+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a\\\\n2025-12-03T23:41:26+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_e7eac4af-b2a5-45db-83d1-a8700bb36b6a to /host/opt/cni/bin/\\\\n2025-12-03T23:41:26Z [verbose] multus-daemon started\\\\n2025-12-03T23:41:26Z [verbose] Readiness Indicator file check\\\\n2025-12-03T23:42:11Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:42:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzdm8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xj964\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.773214 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"acd1bf47-f475-47f3-95a7-2e0cecec15aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gfvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:38Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-9fkg4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc 
kubenswrapper[4764]: I1203 23:42:44.790427 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"25226c62-0d42-4d79-9830-adb627038120\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eae0394d91ef5f99fb5944deef6fac63e956bfec3660df98bfbfcf547f28d466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://67f5d19a6c1fb3926bb46a50feb9e0cc3657a32edd24ed0f4fc071f6781bb511\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcb66e4c3bf3d21bc85735a4fe7cbb466d7709ac5f5fd0bb0c74636ce20c3794\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8f2f33777fa4ddede8f5b8bdd838a5a67c2565c2d3191728abbb4302cfc5c27\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.808111 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.828957 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.835032 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.835485 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.835704 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.835927 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.836078 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.861847 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:26Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T23:42:15Z\\\",\\\"message\\\":\\\"o:208] Removed *v1.Node event handler 7\\\\nI1203 23:42:15.361555 6720 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 23:42:15.361564 6720 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 23:42:15.361562 6720 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 23:42:15.361583 6720 handler.go:208] 
Removed *v1.Namespace event handler 5\\\\nI1203 23:42:15.361665 6720 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361692 6720 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 23:42:15.361709 6720 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 23:42:15.361728 6720 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 23:42:15.361740 6720 factory.go:656] Stopping watch factory\\\\nI1203 23:42:15.361755 6720 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 23:42:15.361771 6720 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1203 23:42:15.361774 6720 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 23:42:15.361779 6720 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 23:42:15.361796 6720 ovnkube.go:599] Stopped ovnkube\\\\nI1203 23:42:15.361813 6720 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1203 23:42:15.361869 6720 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T23:42:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84e5c81dc35d127c37
d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hjntt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jc5ck\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.887762 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e65da185-54b3-4571-8bb7-b3a0c870066e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c1b58481c4ab461b489b16f6d59f6387aa77ef8e970cd733acff0dd7f6e485f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f64ea52f1c788768f5642c3a3e8a9dc1982f28b808e2f01b64623f0ab9930d42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f07b3af8030b087d047498877c1f4b294fd0eeaac9ee25a3e6153a0d565d934a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be35b25e472773c9ed1239d4805d0a6dd46a7681a3b0ab830a052bcfa147b21c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://041271798de9eb2f2ab9014d24b9fea9364f5347713daafb99ac2163438beeff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74da77d09f44056600e3a1577d4df7874c2bad6840053e51bc756ba2afbc964f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da77d09f44056600e3a1577d4df7874c2bad6840053e51bc756ba2afbc964f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b420723787bee5ebd89e193889c9cf6d18b00596e4267e3dddf294840b448eb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b420723787bee5ebd89e193889c9cf6d18b00596e4267e3dddf294840b448eb8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:05Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7b8ea24d297942b7727803442c9e46e572aa631249160b6f4bc691e7e9fcf44d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b8ea24d297942b7727803442c9e46e572aa631249160b6f4bc691e7e9fcf44d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T23:41:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T23:41:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T23:41:04Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.905816 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.923880 4764 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T23:41:24Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0efdd4765ad4043e1e4c17e89e894a006fff8f7a2c9c05fbdd0b0a7a3a24de0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a16569a77f3f1724b031a442e8f5925499d699f3060c46333e863f06b9868b2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T23:41:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:44Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.938608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.938646 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.938659 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.938677 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:44 crc kubenswrapper[4764]: I1203 23:42:44.938690 4764 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:44Z","lastTransitionTime":"2025-12-03T23:42:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.041701 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.041810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.041839 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.041863 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.041881 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.144479 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.144519 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.144530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.144548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.144560 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.247345 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.247402 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.247420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.247442 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.247459 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.350071 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.350107 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.350117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.350130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.350138 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.453517 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.453606 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.453687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.453786 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.453816 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.545270 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.545334 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.545339 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:45 crc kubenswrapper[4764]: E1203 23:42:45.545450 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:45 crc kubenswrapper[4764]: E1203 23:42:45.545655 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:45 crc kubenswrapper[4764]: E1203 23:42:45.546247 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.557375 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.557422 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.557439 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.557523 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.557545 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.661008 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.661074 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.661091 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.661116 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.661133 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.763810 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.763920 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.763944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.763972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.763991 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.866780 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.866835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.866848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.866866 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.866878 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.970522 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.970579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.970604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.970637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:45 crc kubenswrapper[4764]: I1203 23:42:45.970656 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:45Z","lastTransitionTime":"2025-12-03T23:42:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.073263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.073322 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.073344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.073374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.073398 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.176501 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.176562 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.176581 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.176605 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.176622 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.279977 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.280052 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.280075 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.280104 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.280124 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.383284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.383319 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.383328 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.383344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.383353 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.486024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.486081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.486090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.486110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.486122 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.545708 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:46 crc kubenswrapper[4764]: E1203 23:42:46.546097 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.589226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.589266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.589284 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.589309 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.589330 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.692009 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.692080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.692098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.692123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.692139 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.795666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.795797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.795824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.795855 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.795877 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.898882 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.898951 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.898974 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.899026 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:46 crc kubenswrapper[4764]: I1203 23:42:46.899050 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:46Z","lastTransitionTime":"2025-12-03T23:42:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.002685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.002813 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.002837 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.002867 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.002890 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.105427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.105467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.105477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.105494 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.105504 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.208221 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.208295 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.208321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.208348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.208367 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.310628 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.310666 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.310675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.310691 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.310700 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.413758 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.413831 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.413848 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.413873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.413895 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.517173 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.517240 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.517262 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.517292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.517316 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.544777 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.544854 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.544882 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:47 crc kubenswrapper[4764]: E1203 23:42:47.544955 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:47 crc kubenswrapper[4764]: E1203 23:42:47.545165 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:47 crc kubenswrapper[4764]: E1203 23:42:47.545363 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.620285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.620349 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.620369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.620395 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.620416 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.723123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.723199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.723226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.723260 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.723284 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.826533 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.826620 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.826637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.826660 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.826703 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.929333 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.929401 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.929428 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.929461 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:47 crc kubenswrapper[4764]: I1203 23:42:47.929487 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:47Z","lastTransitionTime":"2025-12-03T23:42:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.032374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.032444 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.032462 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.032527 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.032545 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.135072 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.135149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.135167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.135194 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.135211 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.238078 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.238125 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.238141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.238166 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.238183 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.341884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.341948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.341965 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.341996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.342014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.444468 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.444534 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.444555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.444580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.444598 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.545003 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:48 crc kubenswrapper[4764]: E1203 23:42:48.545122 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.546948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.547002 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.547021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.547044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.547062 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.649687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.649797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.649815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.649841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.649862 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.752858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.752935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.752959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.752989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.753015 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.856363 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.856430 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.856447 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.856473 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.856490 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.959824 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.959892 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.959915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.959948 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:48 crc kubenswrapper[4764]: I1203 23:42:48.959970 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:48Z","lastTransitionTime":"2025-12-03T23:42:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.063983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.064066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.064092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.064123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.064143 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.114554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.114618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.114637 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.114664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.114681 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.137208 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.143023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.143086 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.143108 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.143140 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.143161 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.163876 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.170055 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.170149 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.170176 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.170218 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.170245 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.191417 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.197092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.197224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.197291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.197327 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.197353 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.216244 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.222548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.222593 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.222604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.222623 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.222641 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.250588 4764 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T23:42:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"7b7d1078-78f6-4cc3-a0d3-6cc465c742cf\\\",\\\"systemUUID\\\":\\\"1697e568-5f3e-4ea7-a9c8-fd5696181e3f\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T23:42:49Z is after 2025-08-24T17:21:41Z" Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.250777 4764 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.252956 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.252993 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.253007 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.253024 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.253036 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.356641 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.356698 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.356742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.356767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.356785 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.460775 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.460834 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.460852 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.460878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.460896 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.544956 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.544987 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.545180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.545516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.545649 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:49 crc kubenswrapper[4764]: E1203 23:42:49.545945 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.564279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.564321 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.564337 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.564360 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.564378 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.666326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.666369 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.666382 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.666399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.666413 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.769186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.769268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.769287 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.769313 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.769330 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.872890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.872982 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.873000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.873028 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.873046 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.976059 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.976118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.976135 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.976159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:49 crc kubenswrapper[4764]: I1203 23:42:49.976176 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:49Z","lastTransitionTime":"2025-12-03T23:42:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.079923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.080012 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.080037 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.080070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.080090 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.182945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.183013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.183036 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.183085 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.183107 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.286563 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.286640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.286665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.286694 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.286744 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.389639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.389710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.389791 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.389821 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.389844 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.492792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.492853 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.492878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.492909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.492930 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.545681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:50 crc kubenswrapper[4764]: E1203 23:42:50.546226 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.596302 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.596426 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.596445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.596470 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.596488 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.700209 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.700279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.700298 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.700326 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.700346 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.803445 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.803664 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.803682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.803742 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.803761 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.907098 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.907159 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.907186 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.907217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:50 crc kubenswrapper[4764]: I1203 23:42:50.907243 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:50Z","lastTransitionTime":"2025-12-03T23:42:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.010163 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.010224 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.010241 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.010266 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.010285 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.112808 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.112862 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.112879 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.112903 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.112920 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.215630 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.215700 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.215757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.215789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.215809 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.319526 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.319603 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.319624 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.319656 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.319676 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.423006 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.423110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.423126 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.423152 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.423169 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.526856 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.526915 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.526930 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.526954 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.526969 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.544682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.544757 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.544773 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:51 crc kubenswrapper[4764]: E1203 23:42:51.544867 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:51 crc kubenswrapper[4764]: E1203 23:42:51.544965 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:51 crc kubenswrapper[4764]: E1203 23:42:51.545145 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.630353 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.630420 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.630438 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.630467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.630547 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.733211 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.733271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.733291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.733318 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.733335 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.837467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.837554 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.837577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.837609 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.837628 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.941577 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.941634 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.941650 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.941675 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:51 crc kubenswrapper[4764]: I1203 23:42:51.941692 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:51Z","lastTransitionTime":"2025-12-03T23:42:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.045292 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.045378 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.045404 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.045435 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.045456 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.149090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.149150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.149168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.149192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.149208 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.252421 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.252491 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.252513 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.252541 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.252562 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.355781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.355844 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.355861 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.355885 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.355903 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.458259 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.458323 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.458344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.458371 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.458388 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.545925 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:52 crc kubenswrapper[4764]: E1203 23:42:52.546185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.561467 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.561536 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.561555 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.561579 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.561597 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.664179 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.664232 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.664247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.664268 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.664282 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.767411 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.767492 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.767548 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.767580 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.767602 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.871035 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.871117 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.871141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.871184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.871210 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.974181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.974226 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.974248 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.974265 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:52 crc kubenswrapper[4764]: I1203 23:42:52.974277 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:52Z","lastTransitionTime":"2025-12-03T23:42:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.077789 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.077860 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.077878 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.077904 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.077920 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.179932 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.179978 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.179989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.180083 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.180097 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.282572 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.282627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.282644 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.282668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.282683 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.385884 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.385941 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.385959 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.385984 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.386000 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.488888 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.488949 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.488970 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.489000 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.489022 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.545662 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.545757 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:53 crc kubenswrapper[4764]: E1203 23:42:53.545902 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.545951 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:53 crc kubenswrapper[4764]: E1203 23:42:53.546072 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:53 crc kubenswrapper[4764]: E1203 23:42:53.546179 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.591815 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.591864 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.591876 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.591894 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.591906 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.694670 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.694741 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.694753 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.694772 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.694784 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.797896 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.798080 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.798111 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.798137 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.798157 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.900685 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.900774 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.900793 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.900820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:53 crc kubenswrapper[4764]: I1203 23:42:53.900879 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:53Z","lastTransitionTime":"2025-12-03T23:42:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.005652 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.005769 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.005794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.005829 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.005856 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.108964 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.109011 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.109021 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.109040 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.109052 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.215278 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.215331 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.215348 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.215374 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.215390 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.318359 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.318399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.318416 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.318437 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.318452 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.421627 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.421684 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.421701 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.421751 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.421768 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.528992 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.529051 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.529070 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.529097 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.529115 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.544919 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:54 crc kubenswrapper[4764]: E1203 23:42:54.545664 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.546087 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:42:54 crc kubenswrapper[4764]: E1203 23:42:54.546372 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jc5ck_openshift-ovn-kubernetes(9d56d81d-b8c8-43d2-a678-d34d2ae54e64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.597570 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=28.597542596 podStartE2EDuration="28.597542596s" podCreationTimestamp="2025-12-03 23:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.596767664 +0000 UTC m=+110.358092165" watchObservedRunningTime="2025-12-03 23:42:54.597542596 +0000 UTC m=+110.358867047" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.632256 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.632362 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.632450 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.632511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc 
kubenswrapper[4764]: I1203 23:42:54.632530 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.687953 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=92.68792356 podStartE2EDuration="1m32.68792356s" podCreationTimestamp="2025-12-03 23:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.665941093 +0000 UTC m=+110.427265544" watchObservedRunningTime="2025-12-03 23:42:54.68792356 +0000 UTC m=+110.449248011" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.688165 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=90.688157416 podStartE2EDuration="1m30.688157416s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.687604561 +0000 UTC m=+110.448929032" watchObservedRunningTime="2025-12-03 23:42:54.688157416 +0000 UTC m=+110.449481857" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.703120 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podStartSLOduration=91.703088728 podStartE2EDuration="1m31.703088728s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.702935894 +0000 UTC m=+110.464260355" watchObservedRunningTime="2025-12-03 23:42:54.703088728 +0000 UTC m=+110.464413169" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.731695 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xc6rn" podStartSLOduration=91.731660707 podStartE2EDuration="1m31.731660707s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.731005288 +0000 UTC m=+110.492329769" watchObservedRunningTime="2025-12-03 23:42:54.731660707 +0000 UTC m=+110.492985168" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.736639 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.736686 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.736705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.736757 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.736776 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.754100 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bbbnd" podStartSLOduration=91.754071395 podStartE2EDuration="1m31.754071395s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.75352341 +0000 UTC m=+110.514847861" watchObservedRunningTime="2025-12-03 23:42:54.754071395 +0000 UTC m=+110.515395836" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.791589 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-94tkt" podStartSLOduration=90.79156286 podStartE2EDuration="1m30.79156286s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.773177022 +0000 UTC m=+110.534501483" watchObservedRunningTime="2025-12-03 23:42:54.79156286 +0000 UTC m=+110.552887311" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.791999 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=30.791989941 podStartE2EDuration="30.791989941s" podCreationTimestamp="2025-12-03 23:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.790712276 +0000 UTC m=+110.552036757" watchObservedRunningTime="2025-12-03 23:42:54.791989941 +0000 UTC m=+110.553314392" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.840560 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc 
kubenswrapper[4764]: I1203 23:42:54.840597 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.840608 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.840625 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.840636 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.851523 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ck5p6" podStartSLOduration=91.851490703 podStartE2EDuration="1m31.851490703s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.849694524 +0000 UTC m=+110.611018945" watchObservedRunningTime="2025-12-03 23:42:54.851490703 +0000 UTC m=+110.612815154" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.886926 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xj964" podStartSLOduration=91.8868995 podStartE2EDuration="1m31.8868995s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
23:42:54.871921977 +0000 UTC m=+110.633246458" watchObservedRunningTime="2025-12-03 23:42:54.8868995 +0000 UTC m=+110.648223921" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.917674 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=64.917649699 podStartE2EDuration="1m4.917649699s" podCreationTimestamp="2025-12-03 23:41:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:42:54.902354017 +0000 UTC m=+110.663678488" watchObservedRunningTime="2025-12-03 23:42:54.917649699 +0000 UTC m=+110.678974130" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.942689 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.942778 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.942794 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.942814 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:54 crc kubenswrapper[4764]: I1203 23:42:54.942828 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:54Z","lastTransitionTime":"2025-12-03T23:42:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.049039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.049113 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.049144 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.049174 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.049193 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.152013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.152065 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.152081 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.152102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.152117 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.253935 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.253972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.253983 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.253998 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.254009 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.356707 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.356797 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.356817 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.356841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.356858 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.459873 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.459944 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.459969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.459996 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.460014 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.545299 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.545299 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.545373 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:55 crc kubenswrapper[4764]: E1203 23:42:55.545511 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:55 crc kubenswrapper[4764]: E1203 23:42:55.545794 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:55 crc kubenswrapper[4764]: E1203 23:42:55.545657 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.563187 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.563244 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.563263 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.563285 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.563302 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.665463 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.665514 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.665530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.665553 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.665570 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.768909 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.768972 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.768989 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.769013 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.769030 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.871066 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.871110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.871122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.871141 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.871155 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.974039 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.974079 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.974092 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.974110 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:55 crc kubenswrapper[4764]: I1203 23:42:55.974123 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:55Z","lastTransitionTime":"2025-12-03T23:42:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.078130 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.078184 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.078197 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.078215 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.078226 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.180969 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.181042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.181067 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.181096 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.181119 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.284193 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.284256 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.284274 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.284301 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.284318 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.387665 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.387781 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.387816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.387841 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.387857 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.490792 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.490851 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.490874 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.490905 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.490925 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.545067 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:56 crc kubenswrapper[4764]: E1203 23:42:56.545265 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.593749 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.593802 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.593816 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.593835 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.593849 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.696604 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.696668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.696693 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.696767 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.696796 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.800181 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.800267 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.800304 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.800335 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.800357 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.903336 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.903396 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.903413 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.903436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:56 crc kubenswrapper[4764]: I1203 23:42:56.903452 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:56Z","lastTransitionTime":"2025-12-03T23:42:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.006048 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.006102 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.006118 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.006142 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.006159 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.109150 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.109227 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.109247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.109279 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.109296 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.212710 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.212883 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.212899 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.212923 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.212941 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.315615 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.315689 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.315712 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.315776 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.315802 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.429539 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.429828 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.429858 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.429890 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.429925 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.534358 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.534427 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.534452 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.534480 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.534501 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.544708 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.544713 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:57 crc kubenswrapper[4764]: E1203 23:42:57.544931 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:57 crc kubenswrapper[4764]: E1203 23:42:57.545060 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.545249 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:57 crc kubenswrapper[4764]: E1203 23:42:57.545338 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.637399 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.637429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.637436 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.637449 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.637456 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.676443 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/1.log" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.677131 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/0.log" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.677205 4764 generic.go:334] "Generic (PLEG): container finished" podID="8789b456-ab23-4316-880d-5c02242cd3fd" containerID="dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4" exitCode=1 Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.677248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerDied","Data":"dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.677297 4764 scope.go:117] "RemoveContainer" containerID="b0256da53d9cc7dfc63837d808ef4dffe667c56b037af626782fed9dfc534bbc" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.678160 4764 scope.go:117] "RemoveContainer" containerID="dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4" Dec 03 23:42:57 crc kubenswrapper[4764]: E1203 23:42:57.678630 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xj964_openshift-multus(8789b456-ab23-4316-880d-5c02242cd3fd)\"" pod="openshift-multus/multus-xj964" podUID="8789b456-ab23-4316-880d-5c02242cd3fd" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.740946 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc 
kubenswrapper[4764]: I1203 23:42:57.741042 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.741061 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.741090 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.741108 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.844168 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.844230 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.844247 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.844271 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.844288 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.947342 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.947429 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.947483 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.947507 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:57 crc kubenswrapper[4764]: I1203 23:42:57.947527 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:57Z","lastTransitionTime":"2025-12-03T23:42:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.050588 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.050668 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.050683 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.050705 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.050742 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.153063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.153100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.153109 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.153122 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.153131 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.255612 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.255645 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.255654 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.255682 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.255692 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.358975 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.359044 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.359063 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.359530 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.359599 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.463100 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.463167 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.463192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.463222 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.463244 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.545814 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:42:58 crc kubenswrapper[4764]: E1203 23:42:58.546117 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.567124 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.567201 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.567214 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.567236 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.567254 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.670477 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.670525 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.670542 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.670565 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.670581 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.684252 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/1.log" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.773931 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.773995 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.774023 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.774052 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.774077 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.877062 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.877123 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.877158 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.877192 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.877217 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.979551 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.979618 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.979640 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.979669 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:58 crc kubenswrapper[4764]: I1203 23:42:58.979690 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:58Z","lastTransitionTime":"2025-12-03T23:42:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.082228 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.082275 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.082310 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.082329 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.082342 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.185687 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.185796 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.185811 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.185827 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.185839 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.289291 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.289344 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.289356 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.289391 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.289403 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.392138 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.392199 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.392217 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.392245 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.392264 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.495511 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.495570 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.495585 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.495601 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.495611 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.545709 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.545823 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.545836 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:42:59 crc kubenswrapper[4764]: E1203 23:42:59.546251 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:42:59 crc kubenswrapper[4764]: E1203 23:42:59.546404 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:42:59 crc kubenswrapper[4764]: E1203 23:42:59.546535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.598865 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.598910 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.598925 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.598945 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.598962 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.622068 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.622121 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.622133 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.622151 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.622165 4764 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T23:42:59Z","lastTransitionTime":"2025-12-03T23:42:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.682945 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67"] Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.683621 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.687611 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.687615 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.690119 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.693655 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.835696 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9ede8974-de2d-442b-aff4-7b36b5194675-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.835952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ede8974-de2d-442b-aff4-7b36b5194675-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.836021 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9ede8974-de2d-442b-aff4-7b36b5194675-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.836087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ede8974-de2d-442b-aff4-7b36b5194675-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.836165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ede8974-de2d-442b-aff4-7b36b5194675-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9ede8974-de2d-442b-aff4-7b36b5194675-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ede8974-de2d-442b-aff4-7b36b5194675-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937664 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/9ede8974-de2d-442b-aff4-7b36b5194675-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937675 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/9ede8974-de2d-442b-aff4-7b36b5194675-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ede8974-de2d-442b-aff4-7b36b5194675-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ede8974-de2d-442b-aff4-7b36b5194675-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.937869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/9ede8974-de2d-442b-aff4-7b36b5194675-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.938950 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9ede8974-de2d-442b-aff4-7b36b5194675-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.951949 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ede8974-de2d-442b-aff4-7b36b5194675-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:42:59 crc kubenswrapper[4764]: I1203 23:42:59.959055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9ede8974-de2d-442b-aff4-7b36b5194675-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4gg67\" (UID: \"9ede8974-de2d-442b-aff4-7b36b5194675\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:43:00 crc kubenswrapper[4764]: I1203 23:43:00.011051 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" Dec 03 23:43:00 crc kubenswrapper[4764]: W1203 23:43:00.032886 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ede8974_de2d_442b_aff4_7b36b5194675.slice/crio-1dba5d3e684fa02bfe3ebd118931d360ebf5aba5065f174181ac04264af69624 WatchSource:0}: Error finding container 1dba5d3e684fa02bfe3ebd118931d360ebf5aba5065f174181ac04264af69624: Status 404 returned error can't find the container with id 1dba5d3e684fa02bfe3ebd118931d360ebf5aba5065f174181ac04264af69624 Dec 03 23:43:00 crc kubenswrapper[4764]: I1203 23:43:00.545533 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:00 crc kubenswrapper[4764]: E1203 23:43:00.545986 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:00 crc kubenswrapper[4764]: I1203 23:43:00.696812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" event={"ID":"9ede8974-de2d-442b-aff4-7b36b5194675","Type":"ContainerStarted","Data":"599beaccad42058a7a875da0e398871ee624c156493526973323000444fc90e2"} Dec 03 23:43:00 crc kubenswrapper[4764]: I1203 23:43:00.697006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" event={"ID":"9ede8974-de2d-442b-aff4-7b36b5194675","Type":"ContainerStarted","Data":"1dba5d3e684fa02bfe3ebd118931d360ebf5aba5065f174181ac04264af69624"} Dec 03 23:43:01 crc kubenswrapper[4764]: I1203 23:43:01.545302 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:01 crc kubenswrapper[4764]: I1203 23:43:01.545302 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:01 crc kubenswrapper[4764]: I1203 23:43:01.545487 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:01 crc kubenswrapper[4764]: E1203 23:43:01.545758 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:01 crc kubenswrapper[4764]: E1203 23:43:01.545885 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:01 crc kubenswrapper[4764]: E1203 23:43:01.546031 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:02 crc kubenswrapper[4764]: I1203 23:43:02.544840 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:02 crc kubenswrapper[4764]: E1203 23:43:02.545020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:03 crc kubenswrapper[4764]: I1203 23:43:03.545296 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:03 crc kubenswrapper[4764]: I1203 23:43:03.545363 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:03 crc kubenswrapper[4764]: I1203 23:43:03.545466 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:03 crc kubenswrapper[4764]: E1203 23:43:03.545636 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:03 crc kubenswrapper[4764]: E1203 23:43:03.545866 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:03 crc kubenswrapper[4764]: E1203 23:43:03.546038 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:04 crc kubenswrapper[4764]: I1203 23:43:04.544961 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:04 crc kubenswrapper[4764]: E1203 23:43:04.547123 4764 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 23:43:04 crc kubenswrapper[4764]: E1203 23:43:04.547187 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:04 crc kubenswrapper[4764]: E1203 23:43:04.622323 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 23:43:05 crc kubenswrapper[4764]: I1203 23:43:05.545037 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:05 crc kubenswrapper[4764]: I1203 23:43:05.545113 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:05 crc kubenswrapper[4764]: I1203 23:43:05.545225 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:05 crc kubenswrapper[4764]: E1203 23:43:05.545459 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:05 crc kubenswrapper[4764]: E1203 23:43:05.545799 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:05 crc kubenswrapper[4764]: E1203 23:43:05.545965 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:06 crc kubenswrapper[4764]: I1203 23:43:06.545316 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:06 crc kubenswrapper[4764]: E1203 23:43:06.545548 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:07 crc kubenswrapper[4764]: I1203 23:43:07.545266 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:07 crc kubenswrapper[4764]: I1203 23:43:07.545405 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:07 crc kubenswrapper[4764]: E1203 23:43:07.545469 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:07 crc kubenswrapper[4764]: I1203 23:43:07.545266 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:07 crc kubenswrapper[4764]: E1203 23:43:07.545589 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:07 crc kubenswrapper[4764]: E1203 23:43:07.545883 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:08 crc kubenswrapper[4764]: I1203 23:43:08.545610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:08 crc kubenswrapper[4764]: E1203 23:43:08.545936 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:09 crc kubenswrapper[4764]: I1203 23:43:09.545475 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:09 crc kubenswrapper[4764]: I1203 23:43:09.545532 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:09 crc kubenswrapper[4764]: I1203 23:43:09.545664 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:09 crc kubenswrapper[4764]: E1203 23:43:09.545934 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:09 crc kubenswrapper[4764]: E1203 23:43:09.546880 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:09 crc kubenswrapper[4764]: E1203 23:43:09.547180 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:09 crc kubenswrapper[4764]: I1203 23:43:09.547599 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:43:09 crc kubenswrapper[4764]: E1203 23:43:09.624402 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Dec 03 23:43:09 crc kubenswrapper[4764]: I1203 23:43:09.732031 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/3.log" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.545565 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:10 crc kubenswrapper[4764]: E1203 23:43:10.545685 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.556690 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4gg67" podStartSLOduration=107.556675 podStartE2EDuration="1m47.556675s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:00.710922185 +0000 UTC m=+116.472246596" watchObservedRunningTime="2025-12-03 23:43:10.556675 +0000 UTC m=+126.317999401" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.556991 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9fkg4"] Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.744100 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/3.log" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.747510 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.747526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerStarted","Data":"28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c"} Dec 03 23:43:10 crc kubenswrapper[4764]: E1203 23:43:10.747640 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.749000 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:43:10 crc kubenswrapper[4764]: I1203 23:43:10.780755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podStartSLOduration=107.780737683 podStartE2EDuration="1m47.780737683s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:10.779666214 +0000 UTC m=+126.540990665" watchObservedRunningTime="2025-12-03 23:43:10.780737683 +0000 UTC m=+126.542062104" Dec 03 23:43:11 crc kubenswrapper[4764]: I1203 23:43:11.545511 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:11 crc kubenswrapper[4764]: I1203 23:43:11.545592 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:11 crc kubenswrapper[4764]: I1203 23:43:11.545527 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:11 crc kubenswrapper[4764]: E1203 23:43:11.545844 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:11 crc kubenswrapper[4764]: E1203 23:43:11.545971 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:11 crc kubenswrapper[4764]: E1203 23:43:11.546163 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:12 crc kubenswrapper[4764]: I1203 23:43:12.545095 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:12 crc kubenswrapper[4764]: E1203 23:43:12.545495 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:12 crc kubenswrapper[4764]: I1203 23:43:12.545594 4764 scope.go:117] "RemoveContainer" containerID="dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4" Dec 03 23:43:13 crc kubenswrapper[4764]: I1203 23:43:13.545277 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:13 crc kubenswrapper[4764]: I1203 23:43:13.545319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:13 crc kubenswrapper[4764]: I1203 23:43:13.545277 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:13 crc kubenswrapper[4764]: E1203 23:43:13.545503 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:13 crc kubenswrapper[4764]: E1203 23:43:13.545654 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:13 crc kubenswrapper[4764]: E1203 23:43:13.545771 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:13 crc kubenswrapper[4764]: I1203 23:43:13.762074 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/1.log" Dec 03 23:43:13 crc kubenswrapper[4764]: I1203 23:43:13.762174 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerStarted","Data":"3b7a10a11e2b6f0c7c42801239417c3406c7425f92d20b8cddf777b62c812032"} Dec 03 23:43:14 crc kubenswrapper[4764]: I1203 23:43:14.545214 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:14 crc kubenswrapper[4764]: E1203 23:43:14.547050 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:14 crc kubenswrapper[4764]: E1203 23:43:14.625372 4764 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 23:43:15 crc kubenswrapper[4764]: I1203 23:43:15.544755 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:15 crc kubenswrapper[4764]: I1203 23:43:15.544846 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:15 crc kubenswrapper[4764]: I1203 23:43:15.544778 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:15 crc kubenswrapper[4764]: E1203 23:43:15.544961 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:15 crc kubenswrapper[4764]: E1203 23:43:15.545243 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:15 crc kubenswrapper[4764]: E1203 23:43:15.545132 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:16 crc kubenswrapper[4764]: I1203 23:43:16.545009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:16 crc kubenswrapper[4764]: E1203 23:43:16.545219 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:17 crc kubenswrapper[4764]: I1203 23:43:17.544973 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:17 crc kubenswrapper[4764]: I1203 23:43:17.545027 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:17 crc kubenswrapper[4764]: I1203 23:43:17.545091 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:17 crc kubenswrapper[4764]: E1203 23:43:17.545170 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:17 crc kubenswrapper[4764]: E1203 23:43:17.545330 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:17 crc kubenswrapper[4764]: E1203 23:43:17.545419 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:18 crc kubenswrapper[4764]: I1203 23:43:18.545710 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:18 crc kubenswrapper[4764]: E1203 23:43:18.545952 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9fkg4" podUID="acd1bf47-f475-47f3-95a7-2e0cecec15aa" Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.544650 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.544796 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.544668 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:19 crc kubenswrapper[4764]: E1203 23:43:19.544907 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 23:43:19 crc kubenswrapper[4764]: E1203 23:43:19.545028 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 23:43:19 crc kubenswrapper[4764]: E1203 23:43:19.545176 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.946820 4764 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.996423 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tl5lg"] Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.997807 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:19 crc kubenswrapper[4764]: I1203 23:43:19.998093 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.001560 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.011374 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.011445 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.011573 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.011612 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.012193 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.013689 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.014014 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.014108 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.014946 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.015267 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.015639 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.015948 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.018449 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vlqkp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.019183 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.021041 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.021578 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.021979 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.022839 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.028822 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.029611 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.030534 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.031263 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.037104 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.039132 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.040317 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.066760 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn7zt"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.067419 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.068082 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-l7xdw"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.068830 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.069775 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.070176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.071208 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.072783 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.072814 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.072795 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.073091 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.073180 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.073441 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.074540 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjz4g\" (UniqueName: \"kubernetes.io/projected/6590d259-1d9c-41e2-b070-5e7a1fa53d34-kube-api-access-gjz4g\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-auth-proxy-config\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-serving-cert\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-audit-policies\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081527 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-serving-cert\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nhn6\" (UniqueName: 
\"kubernetes.io/projected/82bbef02-a9d4-42e3-a874-f702e232be80-kube-api-access-4nhn6\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-config\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-etcd-client\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll5hm\" (UniqueName: \"kubernetes.io/projected/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-kube-api-access-ll5hm\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-config\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc 
kubenswrapper[4764]: I1203 23:43:20.081659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5275f70a-3cd1-4955-8e28-6027b725376d-audit-dir\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-trusted-ca-bundle\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081699 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081744 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6590d259-1d9c-41e2-b070-5e7a1fa53d34-serving-cert\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jr756\" (UID: 
\"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckxt\" (UniqueName: \"kubernetes.io/projected/5275f70a-3cd1-4955-8e28-6027b725376d-kube-api-access-bckxt\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/82bbef02-a9d4-42e3-a874-f702e232be80-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-config\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081859 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-service-ca\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-client-ca\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-oauth-serving-cert\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081920 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x5xw\" (UniqueName: \"kubernetes.io/projected/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-kube-api-access-8x5xw\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8324e0f-a676-4015-bbcb-cf68235eb72a-node-pullsecrets\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.081961 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-oauth-config\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc 
kubenswrapper[4764]: I1203 23:43:20.081982 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82bbef02-a9d4-42e3-a874-f702e232be80-serving-cert\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.082002 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-machine-approver-tls\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.082024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-encryption-config\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.082550 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.082761 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.082980 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.083599 4764 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.084143 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.084895 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mh8l6"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.085171 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.086316 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.088272 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.088642 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.089049 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.089265 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.089284 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.090094 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.090171 4764 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091358 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091421 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091602 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091645 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091692 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091740 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.091894 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.092019 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.092038 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.092138 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.092268 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.092972 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.093080 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.093265 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.095620 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.102631 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.106482 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-glq4z"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.107063 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.108096 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.108920 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109041 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109080 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109053 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109190 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109054 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109629 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.109887 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.111019 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-f2nwm"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.112842 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.113055 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.118500 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cgmmp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.118909 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119126 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119249 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119270 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119561 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119667 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.119989 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.120141 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.120678 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.123117 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.123285 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.125705 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.126396 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.126673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.126682 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.127253 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4js5f"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.141330 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8wjsw"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.141947 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tl5lg"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.141973 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.142591 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.143621 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.143890 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.144290 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.145454 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.148460 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.152001 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.152274 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.152494 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wxtcz"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.152882 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.152926 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.152970 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.153379 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.153576 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.153602 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.153805 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.154020 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.154329 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.154535 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.155065 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.155688 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.157425 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.157531 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.157662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.158049 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.158381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.158600 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.159948 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.161384 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.161501 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.161888 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.162283 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.163168 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.163397 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.163439 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.164370 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.164637 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.165051 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.165173 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.165309 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.165583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.165649 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-66kxp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.165744 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.166826 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.166954 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn7zt"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.168658 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.168691 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.168801 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.168844 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.168859 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.168954 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.169492 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.169910 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.171039 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.171121 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.180210 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.180499 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.181111 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vlqkp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.181135 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fzmrn"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.181261 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.181653 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k47db"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182017 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182095 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182352 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sws9j"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182478 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-client-ca\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-oauth-serving-cert\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x5xw\" (UniqueName: \"kubernetes.io/projected/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-kube-api-access-8x5xw\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182603 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/e8324e0f-a676-4015-bbcb-cf68235eb72a-node-pullsecrets\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182639 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182997 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182999 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.184102 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-oauth-serving-cert\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188623 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188647 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-client-ca\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8324e0f-a676-4015-bbcb-cf68235eb72a-node-pullsecrets\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188796 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.182640 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-oauth-config\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4a5d117-9aa6-4b48-8862-2be01934454a-images\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188902 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82bbef02-a9d4-42e3-a874-f702e232be80-serving-cert\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-machine-approver-tls\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-image-import-ca\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.188980 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-encryption-config\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a5d117-9aa6-4b48-8862-2be01934454a-config\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189049 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjz4g\" (UniqueName: \"kubernetes.io/projected/6590d259-1d9c-41e2-b070-5e7a1fa53d34-kube-api-access-gjz4g\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-auth-proxy-config\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-encryption-config\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189118 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qq8j\" (UniqueName: \"kubernetes.io/projected/a44e8e46-19c5-4242-8186-12ec04167e59-kube-api-access-6qq8j\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-serving-cert\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-audit-policies\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e8e46-19c5-4242-8186-12ec04167e59-serving-cert\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr54\" (UniqueName: \"kubernetes.io/projected/d4a5d117-9aa6-4b48-8862-2be01934454a-kube-api-access-wgr54\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-serving-cert\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189270 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4nhn6\" (UniqueName: \"kubernetes.io/projected/82bbef02-a9d4-42e3-a874-f702e232be80-kube-api-access-4nhn6\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-config\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-etcd-client\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-config\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-client-ca\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 
03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll5hm\" (UniqueName: \"kubernetes.io/projected/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-kube-api-access-ll5hm\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-config\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5275f70a-3cd1-4955-8e28-6027b725376d-audit-dir\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-trusted-ca-bundle\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " 
pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189521 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-config\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189545 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-etcd-client\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189601 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6590d259-1d9c-41e2-b070-5e7a1fa53d34-serving-cert\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: 
\"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bckxt\" (UniqueName: \"kubernetes.io/projected/5275f70a-3cd1-4955-8e28-6027b725376d-kube-api-access-bckxt\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189670 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/82bbef02-a9d4-42e3-a874-f702e232be80-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-etcd-serving-ca\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189733 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-config\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189754 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-serving-cert\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4a5d117-9aa6-4b48-8862-2be01934454a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-service-ca\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-audit\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8324e0f-a676-4015-bbcb-cf68235eb72a-audit-dir\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.189880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-t4vmv\" (UniqueName: \"kubernetes.io/projected/e8324e0f-a676-4015-bbcb-cf68235eb72a-kube-api-access-t4vmv\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.190211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5275f70a-3cd1-4955-8e28-6027b725376d-audit-dir\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.191185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-auth-proxy-config\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.191366 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/82bbef02-a9d4-42e3-a874-f702e232be80-available-featuregates\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.192075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.192499 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-trusted-ca-bundle\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.192850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.193253 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-encryption-config\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.193334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82bbef02-a9d4-42e3-a874-f702e232be80-serving-cert\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.194006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-config\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.194349 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-machine-approver-tls\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.195411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-config\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.194596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5275f70a-3cd1-4955-8e28-6027b725376d-audit-policies\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.198267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-oauth-config\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.206920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-serving-cert\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.207109 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6590d259-1d9c-41e2-b070-5e7a1fa53d34-serving-cert\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.207113 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.207189 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.207255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-serving-cert\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.207274 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-config\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.207659 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-service-ca\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.208058 
4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.208089 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mh8l6"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.208188 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.208740 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.208969 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-glq4z"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.210487 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.211038 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5275f70a-3cd1-4955-8e28-6027b725376d-etcd-client\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.211536 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f2nwm"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.212806 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8wjsw"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.213825 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-sm25h"] Dec 
03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.214820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.215004 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.216271 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.219648 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.224528 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.225588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.227058 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.230070 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.231802 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.232563 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-4js5f"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.233582 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.235652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.236775 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.237791 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.238791 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.239819 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.240857 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cgmmp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.241901 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-l7xdw"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.242006 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.242936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.244112 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.245165 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-sm25h"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.246365 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2lqqs"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.247090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.247406 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.248414 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-d7hth"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.249064 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.249433 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.250478 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.251506 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fzmrn"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.252516 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-66kxp"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.253492 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.254699 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k47db"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.255819 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sws9j"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.256852 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lqqs"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.257961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.258941 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tw5p6"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.259508 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tw5p6" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.260050 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tw5p6"] Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.262051 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.282261 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-etcd-client\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290530 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d27877-3051-420c-a43e-f12be4e82450-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290562 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-etcd-serving-ca\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290578 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-serving-cert\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290595 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4a5d117-9aa6-4b48-8862-2be01934454a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-key\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290626 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-cabundle\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290644 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-audit\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290667 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8324e0f-a676-4015-bbcb-cf68235eb72a-audit-dir\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4vmv\" (UniqueName: \"kubernetes.io/projected/e8324e0f-a676-4015-bbcb-cf68235eb72a-kube-api-access-t4vmv\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91d27877-3051-420c-a43e-f12be4e82450-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4a5d117-9aa6-4b48-8862-2be01934454a-images\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-image-import-ca\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290799 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a5d117-9aa6-4b48-8862-2be01934454a-config\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd2d9\" (UniqueName: \"kubernetes.io/projected/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-kube-api-access-wd2d9\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290840 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7d44\" (UniqueName: \"kubernetes.io/projected/fb55c6e2-2206-4735-ba1a-bfbff1e7549a-kube-api-access-b7d44\") pod \"cluster-samples-operator-665b6dd947-j6rcl\" (UID: \"fb55c6e2-2206-4735-ba1a-bfbff1e7549a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-encryption-config\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290882 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qq8j\" (UniqueName: \"kubernetes.io/projected/a44e8e46-19c5-4242-8186-12ec04167e59-kube-api-access-6qq8j\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e8e46-19c5-4242-8186-12ec04167e59-serving-cert\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgr54\" (UniqueName: \"kubernetes.io/projected/d4a5d117-9aa6-4b48-8862-2be01934454a-kube-api-access-wgr54\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r924j\" (UniqueName: \"kubernetes.io/projected/0af14017-eba4-401e-9f16-d4a7fd22b6b8-kube-api-access-r924j\") pod \"migrator-59844c95c7-4ln9l\" (UID: \"0af14017-eba4-401e-9f16-d4a7fd22b6b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 
23:43:20.290964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-config\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.290980 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-client-ca\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb55c6e2-2206-4735-ba1a-bfbff1e7549a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j6rcl\" (UID: \"fb55c6e2-2206-4735-ba1a-bfbff1e7549a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-config\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291053 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xthpk\" (UniqueName: \"kubernetes.io/projected/91d27877-3051-420c-a43e-f12be4e82450-kube-api-access-xthpk\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mx79\" (UniqueName: \"kubernetes.io/projected/d64b774a-681b-4c0f-b2eb-36398275e451-kube-api-access-5mx79\") pod \"downloads-7954f5f757-f2nwm\" (UID: \"d64b774a-681b-4c0f-b2eb-36398275e451\") " pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-etcd-serving-ca\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.291802 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-image-import-ca\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.292035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.292286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-config\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.292580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4a5d117-9aa6-4b48-8862-2be01934454a-config\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.292644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8324e0f-a676-4015-bbcb-cf68235eb72a-audit-dir\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.292608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-audit\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.292828 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-client-ca\") 
pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.293421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8324e0f-a676-4015-bbcb-cf68235eb72a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.293454 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d4a5d117-9aa6-4b48-8862-2be01934454a-images\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.294264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-config\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.294395 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d4a5d117-9aa6-4b48-8862-2be01934454a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.295155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a44e8e46-19c5-4242-8186-12ec04167e59-serving-cert\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.295701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-etcd-client\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.296739 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-encryption-config\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.302898 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.308606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8324e0f-a676-4015-bbcb-cf68235eb72a-serving-cert\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.322613 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.342608 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.364894 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.383696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb55c6e2-2206-4735-ba1a-bfbff1e7549a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j6rcl\" (UID: \"fb55c6e2-2206-4735-ba1a-bfbff1e7549a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mx79\" (UniqueName: \"kubernetes.io/projected/d64b774a-681b-4c0f-b2eb-36398275e451-kube-api-access-5mx79\") pod \"downloads-7954f5f757-f2nwm\" (UID: \"d64b774a-681b-4c0f-b2eb-36398275e451\") " pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392577 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthpk\" (UniqueName: \"kubernetes.io/projected/91d27877-3051-420c-a43e-f12be4e82450-kube-api-access-xthpk\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/91d27877-3051-420c-a43e-f12be4e82450-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392675 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-key\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-cabundle\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392783 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91d27877-3051-420c-a43e-f12be4e82450-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd2d9\" (UniqueName: \"kubernetes.io/projected/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-kube-api-access-wd2d9\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392909 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b7d44\" (UniqueName: \"kubernetes.io/projected/fb55c6e2-2206-4735-ba1a-bfbff1e7549a-kube-api-access-b7d44\") pod \"cluster-samples-operator-665b6dd947-j6rcl\" (UID: \"fb55c6e2-2206-4735-ba1a-bfbff1e7549a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.392982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r924j\" (UniqueName: \"kubernetes.io/projected/0af14017-eba4-401e-9f16-d4a7fd22b6b8-kube-api-access-r924j\") pod \"migrator-59844c95c7-4ln9l\" (UID: \"0af14017-eba4-401e-9f16-d4a7fd22b6b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.393400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91d27877-3051-420c-a43e-f12be4e82450-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.397186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/91d27877-3051-420c-a43e-f12be4e82450-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.397402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb55c6e2-2206-4735-ba1a-bfbff1e7549a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j6rcl\" (UID: \"fb55c6e2-2206-4735-ba1a-bfbff1e7549a\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.413480 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.423016 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.442419 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.462185 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.482590 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.502671 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.522362 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.544289 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.544800 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.562883 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.583647 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.603135 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.622966 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.643615 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.662536 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.683207 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.702804 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.723140 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.754015 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.763466 4764 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.782827 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.802597 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.823801 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.843220 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.863369 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.884074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.902884 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.922574 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.943356 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 23:43:20 crc 
kubenswrapper[4764]: I1203 23:43:20.963923 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 23:43:20 crc kubenswrapper[4764]: I1203 23:43:20.983563 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.003514 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.022992 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.064155 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.084163 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.103863 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.123430 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.142991 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.162589 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.181491 4764 request.go:700] Waited for 1.01613162s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/secrets?fieldSelector=metadata.name%3Dkube-storage-version-migrator-sa-dockercfg-5xfcg&limit=500&resourceVersion=0 Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.183668 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.203788 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.223809 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.243167 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.263997 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.283608 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.302670 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.323980 4764 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.343375 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.363342 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.383179 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: E1203 23:43:21.394180 4764 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Dec 03 23:43:21 crc kubenswrapper[4764]: E1203 23:43:21.394215 4764 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Dec 03 23:43:21 crc kubenswrapper[4764]: E1203 23:43:21.394314 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-cabundle podName:72a14adc-ebea-42b0-bd10-1a82d58d7c0e nodeName:}" failed. No retries permitted until 2025-12-03 23:43:21.894279398 +0000 UTC m=+137.655603839 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-cabundle") pod "service-ca-9c57cc56f-fzmrn" (UID: "72a14adc-ebea-42b0-bd10-1a82d58d7c0e") : failed to sync configmap cache: timed out waiting for the condition Dec 03 23:43:21 crc kubenswrapper[4764]: E1203 23:43:21.394354 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-key podName:72a14adc-ebea-42b0-bd10-1a82d58d7c0e nodeName:}" failed. No retries permitted until 2025-12-03 23:43:21.894334799 +0000 UTC m=+137.655659400 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-key") pod "service-ca-9c57cc56f-fzmrn" (UID: "72a14adc-ebea-42b0-bd10-1a82d58d7c0e") : failed to sync secret cache: timed out waiting for the condition Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.403793 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.423429 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.443090 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.463001 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.482618 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.503831 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.523499 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.543899 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.544971 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.544985 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.545464 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.562908 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.583022 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.603258 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.623527 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.644378 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 23:43:21 
crc kubenswrapper[4764]: I1203 23:43:21.690383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x5xw\" (UniqueName: \"kubernetes.io/projected/5baf5265-7d93-41ff-a1a9-1cdeda3e38f3-kube-api-access-8x5xw\") pod \"machine-approver-56656f9798-kqj2v\" (UID: \"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.731306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjz4g\" (UniqueName: \"kubernetes.io/projected/6590d259-1d9c-41e2-b070-5e7a1fa53d34-kube-api-access-gjz4g\") pod \"route-controller-manager-6576b87f9c-2sfl6\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.751586 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckxt\" (UniqueName: \"kubernetes.io/projected/5275f70a-3cd1-4955-8e28-6027b725376d-kube-api-access-bckxt\") pod \"apiserver-7bbb656c7d-jr756\" (UID: \"5275f70a-3cd1-4955-8e28-6027b725376d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.763627 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.764619 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.773592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhn6\" (UniqueName: \"kubernetes.io/projected/82bbef02-a9d4-42e3-a874-f702e232be80-kube-api-access-4nhn6\") pod \"openshift-config-operator-7777fb866f-n7jnx\" (UID: \"82bbef02-a9d4-42e3-a874-f702e232be80\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.803712 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.813073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll5hm\" (UniqueName: \"kubernetes.io/projected/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-kube-api-access-ll5hm\") pod \"console-f9d7485db-l7xdw\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.823040 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.853056 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.862933 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.885052 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.905629 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.905640 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.910685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-key\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.910761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-cabundle\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.912617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-cabundle\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.918032 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-signing-key\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.924705 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 23:43:21 crc kubenswrapper[4764]: W1203 23:43:21.925140 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5baf5265_7d93_41ff_a1a9_1cdeda3e38f3.slice/crio-f2129f62fabe2f4d6ee88a2dfce7791967e0c54cee14af6e78c3e16c3cc3afda WatchSource:0}: Error finding container f2129f62fabe2f4d6ee88a2dfce7791967e0c54cee14af6e78c3e16c3cc3afda: Status 404 returned error can't find the container with id f2129f62fabe2f4d6ee88a2dfce7791967e0c54cee14af6e78c3e16c3cc3afda Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.928096 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.943571 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.964062 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.965228 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.976457 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:21 crc kubenswrapper[4764]: I1203 23:43:21.983140 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.003108 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.026770 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.045618 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.045942 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.065944 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.083560 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.102879 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.123425 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.143790 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.163049 4764 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.163812 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx"] Dec 03 23:43:22 crc kubenswrapper[4764]: W1203 23:43:22.174453 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82bbef02_a9d4_42e3_a874_f702e232be80.slice/crio-a684d88244b2f92c1b9a9f1be9b2fb35428ae54f9b80b9407dfbfff78d49e2e2 WatchSource:0}: Error finding container a684d88244b2f92c1b9a9f1be9b2fb35428ae54f9b80b9407dfbfff78d49e2e2: Status 404 returned error can't find the container with id a684d88244b2f92c1b9a9f1be9b2fb35428ae54f9b80b9407dfbfff78d49e2e2 Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.175548 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.183378 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 23:43:22 crc kubenswrapper[4764]: W1203 23:43:22.196625 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6590d259_1d9c_41e2_b070_5e7a1fa53d34.slice/crio-3e7a95eb0c7c2d3d30c41e2fcdbecb1d7689d7cc86259105d2329bbc5d42d26d WatchSource:0}: Error finding container 3e7a95eb0c7c2d3d30c41e2fcdbecb1d7689d7cc86259105d2329bbc5d42d26d: Status 404 returned error can't find the container with id 3e7a95eb0c7c2d3d30c41e2fcdbecb1d7689d7cc86259105d2329bbc5d42d26d Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.202258 4764 request.go:700] Waited for 1.942476681s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.204641 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.206081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-l7xdw"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.245298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgr54\" (UniqueName: \"kubernetes.io/projected/d4a5d117-9aa6-4b48-8862-2be01934454a-kube-api-access-wgr54\") pod \"machine-api-operator-5694c8668f-vlqkp\" (UID: \"d4a5d117-9aa6-4b48-8862-2be01934454a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:22 crc kubenswrapper[4764]: W1203 23:43:22.249084 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26ed3c8_0bba_40a7_a18a_e8718b336dcc.slice/crio-915e11b7b27fbb8e0b530162730f310121c20f41c833492cc0a94f37c6346c27 WatchSource:0}: Error finding container 915e11b7b27fbb8e0b530162730f310121c20f41c833492cc0a94f37c6346c27: Status 404 returned error can't find the container with id 915e11b7b27fbb8e0b530162730f310121c20f41c833492cc0a94f37c6346c27 Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.255867 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qq8j\" (UniqueName: \"kubernetes.io/projected/a44e8e46-19c5-4242-8186-12ec04167e59-kube-api-access-6qq8j\") pod \"controller-manager-879f6c89f-bn7zt\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.278813 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4vmv\" (UniqueName: \"kubernetes.io/projected/e8324e0f-a676-4015-bbcb-cf68235eb72a-kube-api-access-t4vmv\") pod \"apiserver-76f77b778f-tl5lg\" (UID: \"e8324e0f-a676-4015-bbcb-cf68235eb72a\") " pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.296956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthpk\" (UniqueName: \"kubernetes.io/projected/91d27877-3051-420c-a43e-f12be4e82450-kube-api-access-xthpk\") pod \"openshift-apiserver-operator-796bbdcf4f-kxpmm\" (UID: \"91d27877-3051-420c-a43e-f12be4e82450\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.317153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mx79\" (UniqueName: \"kubernetes.io/projected/d64b774a-681b-4c0f-b2eb-36398275e451-kube-api-access-5mx79\") pod \"downloads-7954f5f757-f2nwm\" (UID: \"d64b774a-681b-4c0f-b2eb-36398275e451\") " pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.330996 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.346279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r924j\" (UniqueName: \"kubernetes.io/projected/0af14017-eba4-401e-9f16-d4a7fd22b6b8-kube-api-access-r924j\") pod \"migrator-59844c95c7-4ln9l\" (UID: \"0af14017-eba4-401e-9f16-d4a7fd22b6b8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.358325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd2d9\" (UniqueName: \"kubernetes.io/projected/72a14adc-ebea-42b0-bd10-1a82d58d7c0e-kube-api-access-wd2d9\") pod \"service-ca-9c57cc56f-fzmrn\" (UID: \"72a14adc-ebea-42b0-bd10-1a82d58d7c0e\") " pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.376825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7d44\" (UniqueName: \"kubernetes.io/projected/fb55c6e2-2206-4735-ba1a-bfbff1e7549a-kube-api-access-b7d44\") pod \"cluster-samples-operator-665b6dd947-j6rcl\" (UID: \"fb55c6e2-2206-4735-ba1a-bfbff1e7549a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.383344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.388961 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.402841 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.419696 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.442870 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.448072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.463528 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.474401 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.483314 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.503624 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.526594 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.583463 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.600346 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn7zt"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.614495 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.628171 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-certificates\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.628240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.628340 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9c9e052-a750-4b4f-873c-9808c0a3c75b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.628376 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.629308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-client\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.630580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-bound-sa-token\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.631087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.631989 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.635796 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7ae1964-7175-4a71-a02b-8ff8a274a5da-srv-cert\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.635842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e21b87e1-9839-4439-89ac-32f6a196774b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-66kxp\" (UID: \"e21b87e1-9839-4439-89ac-32f6a196774b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.635872 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.635898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7ae1964-7175-4a71-a02b-8ff8a274a5da-profile-collector-cert\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.635929 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdzl\" (UniqueName: \"kubernetes.io/projected/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-kube-api-access-qsdzl\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.635971 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.636014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.636043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.636071 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-tls\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") 
" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.636096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed746947-b1ba-426d-92a8-02db2a949e4b-service-ca-bundle\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.638367 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-trusted-ca\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.638417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-dir\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.638466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75845c6a-1b60-4846-a1ef-7e719c5ce398-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0-metrics-tls\") pod \"dns-operator-744455d44c-glq4z\" (UID: \"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0\") " pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645131 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d83a32b-c8fc-438b-9797-664c5a2c5360-serving-cert\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645149 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-service-ca-bundle\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-stats-auth\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " 
pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8jw\" (UniqueName: \"kubernetes.io/projected/ed746947-b1ba-426d-92a8-02db2a949e4b-kube-api-access-8b8jw\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645699 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163c97f5-601b-47dd-8653-5dffdf157659-metrics-tls\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-service-ca\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f919d5d0-c940-484d-9017-ada61860924a-proxy-tls\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/75845c6a-1b60-4846-a1ef-7e719c5ce398-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163c97f5-601b-47dd-8653-5dffdf157659-trusted-ca\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.645980 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163c97f5-601b-47dd-8653-5dffdf157659-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.646028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpw4j\" (UniqueName: \"kubernetes.io/projected/d7ae1964-7175-4a71-a02b-8ff8a274a5da-kube-api-access-lpw4j\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.646044 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.646078 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.646417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.646758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.646957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647070 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-default-certificate\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647169 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c34ace5f-42af-4058-9792-acfc9340252e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22c49dcd-28aa-42f9-92fd-7e033cba7846-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647355 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pn6j\" (UniqueName: \"kubernetes.io/projected/8c4ea958-3175-42d0-9e0a-26225cada08a-kube-api-access-9pn6j\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9111b8fb-b071-4986-b735-86d3a3a3322c-trusted-ca\") pod 
\"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647581 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb5f7cc-918b-4506-bc59-ccbb930e763f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8pc2\" (UniqueName: \"kubernetes.io/projected/f919d5d0-c940-484d-9017-ada61860924a-kube-api-access-m8pc2\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647802 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-proxy-tls\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75845c6a-1b60-4846-a1ef-7e719c5ce398-config\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647859 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-config\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.647970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c9e052-a750-4b4f-873c-9808c0a3c75b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648169 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9111b8fb-b071-4986-b735-86d3a3a3322c-config\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648200 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4twg4\" (UniqueName: \"kubernetes.io/projected/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-kube-api-access-4twg4\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2kq\" (UniqueName: \"kubernetes.io/projected/c9415d09-8034-4627-80dc-ae731d9f466e-kube-api-access-kn2kq\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-ca\") pod \"etcd-operator-b45778765-8wjsw\" 
(UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34ace5f-42af-4058-9792-acfc9340252e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648445 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c49dcd-28aa-42f9-92fd-7e033cba7846-config\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-metrics-certs\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cacf2ed9-cf5d-4876-93d2-690ba449e153-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kqhxc\" (UID: \"cacf2ed9-cf5d-4876-93d2-690ba449e153\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 
23:43:22.648613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d83a32b-c8fc-438b-9797-664c5a2c5360-config\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648649 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9c9e052-a750-4b4f-873c-9808c0a3c75b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: E1203 23:43:22.648708 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.148691787 +0000 UTC m=+138.910016198 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c34ace5f-42af-4058-9792-acfc9340252e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v967\" (UniqueName: \"kubernetes.io/projected/163c97f5-601b-47dd-8653-5dffdf157659-kube-api-access-6v967\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsj5s\" (UniqueName: \"kubernetes.io/projected/4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0-kube-api-access-bsj5s\") pod \"dns-operator-744455d44c-glq4z\" (UID: \"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0\") " pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcc2d\" 
(UniqueName: \"kubernetes.io/projected/e21b87e1-9839-4439-89ac-32f6a196774b-kube-api-access-dcc2d\") pod \"multus-admission-controller-857f4d67dd-66kxp\" (UID: \"e21b87e1-9839-4439-89ac-32f6a196774b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.648979 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649026 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-srv-cert\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649110 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7hn\" (UniqueName: \"kubernetes.io/projected/8d83a32b-c8fc-438b-9797-664c5a2c5360-kube-api-access-ww7hn\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: 
\"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj9mt\" (UniqueName: \"kubernetes.io/projected/b9c9e052-a750-4b4f-873c-9808c0a3c75b-kube-api-access-wj9mt\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649153 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vh9\" (UniqueName: \"kubernetes.io/projected/6669d173-9f6e-49d9-8159-5c0406eedac9-kube-api-access-42vh9\") pod \"control-plane-machine-set-operator-78cbb6b69f-fnct7\" (UID: \"6669d173-9f6e-49d9-8159-5c0406eedac9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxzk5\" (UniqueName: \"kubernetes.io/projected/28c57be9-2500-4944-abed-6fe2e4e2dd0d-kube-api-access-jxzk5\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f919d5d0-c940-484d-9017-ada61860924a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 
23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46czq\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-kube-api-access-46czq\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kglg6\" (UniqueName: \"kubernetes.io/projected/9111b8fb-b071-4986-b735-86d3a3a3322c-kube-api-access-kglg6\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22c49dcd-28aa-42f9-92fd-7e033cba7846-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f80dc8-d173-44a3-bab4-a3ae88319387-serving-cert\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cppst\" (UniqueName: 
\"kubernetes.io/projected/1eb5f7cc-918b-4506-bc59-ccbb930e763f-kube-api-access-cppst\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649546 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6669d173-9f6e-49d9-8159-5c0406eedac9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fnct7\" (UID: \"6669d173-9f6e-49d9-8159-5c0406eedac9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9kz7\" (UniqueName: \"kubernetes.io/projected/cacf2ed9-cf5d-4876-93d2-690ba449e153-kube-api-access-l9kz7\") pod \"package-server-manager-789f6589d5-kqhxc\" (UID: \"cacf2ed9-cf5d-4876-93d2-690ba449e153\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649581 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-policies\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649597 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-config\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-images\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c4ea958-3175-42d0-9e0a-26225cada08a-serving-cert\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.649698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9111b8fb-b071-4986-b735-86d3a3a3322c-serving-cert\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.650107 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84w6l\" (UniqueName: \"kubernetes.io/projected/b8f80dc8-d173-44a3-bab4-a3ae88319387-kube-api-access-84w6l\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.650150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb5f7cc-918b-4506-bc59-ccbb930e763f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.650167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.650482 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs27m\" (UniqueName: \"kubernetes.io/projected/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-kube-api-access-rs27m\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.650539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751758 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-srv-cert\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7hn\" (UniqueName: \"kubernetes.io/projected/8d83a32b-c8fc-438b-9797-664c5a2c5360-kube-api-access-ww7hn\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751795 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj9mt\" (UniqueName: \"kubernetes.io/projected/b9c9e052-a750-4b4f-873c-9808c0a3c75b-kube-api-access-wj9mt\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b194a71-9d06-4186-8811-8668eaae38b0-apiservice-cert\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vh9\" (UniqueName: \"kubernetes.io/projected/6669d173-9f6e-49d9-8159-5c0406eedac9-kube-api-access-42vh9\") pod \"control-plane-machine-set-operator-78cbb6b69f-fnct7\" (UID: \"6669d173-9f6e-49d9-8159-5c0406eedac9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751873 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-metrics-tls\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f919d5d0-c940-484d-9017-ada61860924a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46czq\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-kube-api-access-46czq\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxzk5\" (UniqueName: \"kubernetes.io/projected/28c57be9-2500-4944-abed-6fe2e4e2dd0d-kube-api-access-jxzk5\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kglg6\" (UniqueName: \"kubernetes.io/projected/9111b8fb-b071-4986-b735-86d3a3a3322c-kube-api-access-kglg6\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " 
pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751952 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220eeac0-5b43-462d-89cc-5182a6b1f686-config-volume\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b194a71-9d06-4186-8811-8668eaae38b0-webhook-cert\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.751983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f80dc8-d173-44a3-bab4-a3ae88319387-serving-cert\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.752068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cppst\" (UniqueName: \"kubernetes.io/projected/1eb5f7cc-918b-4506-bc59-ccbb930e763f-kube-api-access-cppst\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.752453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22c49dcd-28aa-42f9-92fd-7e033cba7846-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.752526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.752615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6669d173-9f6e-49d9-8159-5c0406eedac9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fnct7\" (UID: \"6669d173-9f6e-49d9-8159-5c0406eedac9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: E1203 23:43:22.752731 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.252684861 +0000 UTC m=+139.014009382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.752889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f919d5d0-c940-484d-9017-ada61860924a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.754738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/220eeac0-5b43-462d-89cc-5182a6b1f686-secret-volume\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.754769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-policies\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.754838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-config\") pod 
\"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.754928 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-images\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.755537 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-config\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.755821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-policies\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.755838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9kz7\" (UniqueName: \"kubernetes.io/projected/cacf2ed9-cf5d-4876-93d2-690ba449e153-kube-api-access-l9kz7\") pod \"package-server-manager-789f6589d5-kqhxc\" (UID: \"cacf2ed9-cf5d-4876-93d2-690ba449e153\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9111b8fb-b071-4986-b735-86d3a3a3322c-serving-cert\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wz55\" (UniqueName: \"kubernetes.io/projected/6042f2bb-d728-462a-a118-e2712e9a214f-kube-api-access-9wz55\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756130 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84w6l\" (UniqueName: \"kubernetes.io/projected/b8f80dc8-d173-44a3-bab4-a3ae88319387-kube-api-access-84w6l\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756154 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c4ea958-3175-42d0-9e0a-26225cada08a-serving-cert\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb5f7cc-918b-4506-bc59-ccbb930e763f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 
crc kubenswrapper[4764]: I1203 23:43:22.756203 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756233 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs27m\" (UniqueName: \"kubernetes.io/projected/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-kube-api-access-rs27m\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0b194a71-9d06-4186-8811-8668eaae38b0-tmpfs\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-certificates\") 
pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1a404e9-c110-4033-9ef0-52c3d6a28d87-cert\") pod \"ingress-canary-tw5p6\" (UID: \"d1a404e9-c110-4033-9ef0-52c3d6a28d87\") " pod="openshift-ingress-canary/ingress-canary-tw5p6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756375 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9c9e052-a750-4b4f-873c-9808c0a3c75b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756433 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756457 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-client\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756480 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-socket-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756509 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-bound-sa-token\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7ae1964-7175-4a71-a02b-8ff8a274a5da-srv-cert\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: 
I1203 23:43:22.756592 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e21b87e1-9839-4439-89ac-32f6a196774b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-66kxp\" (UID: \"e21b87e1-9839-4439-89ac-32f6a196774b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756618 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-csi-data-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7ae1964-7175-4a71-a02b-8ff8a274a5da-profile-collector-cert\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756671 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756699 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdzl\" (UniqueName: \"kubernetes.io/projected/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-kube-api-access-qsdzl\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" 
(UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756747 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756789 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jd6\" (UniqueName: \"kubernetes.io/projected/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-kube-api-access-d7jd6\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-images\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756852 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756879 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed746947-b1ba-426d-92a8-02db2a949e4b-service-ca-bundle\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-tls\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-trusted-ca\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.756986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-dir\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: 
I1203 23:43:22.757078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75845c6a-1b60-4846-a1ef-7e719c5ce398-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0-metrics-tls\") pod \"dns-operator-744455d44c-glq4z\" (UID: \"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0\") " pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757154 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-mountpoint-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d83a32b-c8fc-438b-9797-664c5a2c5360-serving-cert\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") 
" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-stats-auth\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8jw\" (UniqueName: \"kubernetes.io/projected/ed746947-b1ba-426d-92a8-02db2a949e4b-kube-api-access-8b8jw\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-service-ca-bundle\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163c97f5-601b-47dd-8653-5dffdf157659-metrics-tls\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757385 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-service-ca\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ad151012-706a-4e5d-9565-70f905dd6a87-certs\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8f80dc8-d173-44a3-bab4-a3ae88319387-serving-cert\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.757439 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f919d5d0-c940-484d-9017-ada61860924a-proxy-tls\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.758256 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-config-volume\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.758326 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75845c6a-1b60-4846-a1ef-7e719c5ce398-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759049 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-certificates\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759083 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-service-ca-bundle\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759514 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbgs\" (UniqueName: \"kubernetes.io/projected/ad151012-706a-4e5d-9565-70f905dd6a87-kube-api-access-fkbgs\") pod \"machine-config-server-d7hth\" (UID: 
\"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759596 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163c97f5-601b-47dd-8653-5dffdf157659-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpmwq\" (UniqueName: \"kubernetes.io/projected/d1a404e9-c110-4033-9ef0-52c3d6a28d87-kube-api-access-gpmwq\") pod \"ingress-canary-tw5p6\" (UID: \"d1a404e9-c110-4033-9ef0-52c3d6a28d87\") " pod="openshift-ingress-canary/ingress-canary-tw5p6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.759744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.762651 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.763322 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.763885 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.764017 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f919d5d0-c940-484d-9017-ada61860924a-proxy-tls\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.764213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-service-ca\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.764298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9c9e052-a750-4b4f-873c-9808c0a3c75b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 
23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.765495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9111b8fb-b071-4986-b735-86d3a3a3322c-serving-cert\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/163c97f5-601b-47dd-8653-5dffdf157659-trusted-ca\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-dir\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766415 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6669d173-9f6e-49d9-8159-5c0406eedac9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fnct7\" (UID: \"6669d173-9f6e-49d9-8159-5c0406eedac9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed746947-b1ba-426d-92a8-02db2a949e4b-service-ca-bundle\") pod \"router-default-5444994796-wxtcz\" (UID: 
\"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpw4j\" (UniqueName: \"kubernetes.io/projected/d7ae1964-7175-4a71-a02b-8ff8a274a5da-kube-api-access-lpw4j\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766605 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1eb5f7cc-918b-4506-bc59-ccbb930e763f-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ad151012-706a-4e5d-9565-70f905dd6a87-node-bootstrap-token\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.766962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-trusted-ca\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.767057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770002 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770069 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-registration-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-default-certificate\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c34ace5f-42af-4058-9792-acfc9340252e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22c49dcd-28aa-42f9-92fd-7e033cba7846-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9111b8fb-b071-4986-b735-86d3a3a3322c-trusted-ca\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pn6j\" (UniqueName: \"kubernetes.io/projected/8c4ea958-3175-42d0-9e0a-26225cada08a-kube-api-access-9pn6j\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1eb5f7cc-918b-4506-bc59-ccbb930e763f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8pc2\" (UniqueName: \"kubernetes.io/projected/f919d5d0-c940-484d-9017-ada61860924a-kube-api-access-m8pc2\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8blt\" (UniqueName: \"kubernetes.io/projected/0b194a71-9d06-4186-8811-8668eaae38b0-kube-api-access-j8blt\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75845c6a-1b60-4846-a1ef-7e719c5ce398-config\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770394 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-config\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-proxy-tls\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b9c9e052-a750-4b4f-873c-9808c0a3c75b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770463 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770503 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4twg4\" (UniqueName: \"kubernetes.io/projected/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-kube-api-access-4twg4\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770530 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2kq\" (UniqueName: \"kubernetes.io/projected/c9415d09-8034-4627-80dc-ae731d9f466e-kube-api-access-kn2kq\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-ca\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9111b8fb-b071-4986-b735-86d3a3a3322c-config\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770623 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34ace5f-42af-4058-9792-acfc9340252e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c49dcd-28aa-42f9-92fd-7e033cba7846-config\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkjts\" (UniqueName: 
\"kubernetes.io/projected/220eeac0-5b43-462d-89cc-5182a6b1f686-kube-api-access-lkjts\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-metrics-certs\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770740 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cacf2ed9-cf5d-4876-93d2-690ba449e153-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kqhxc\" (UID: \"cacf2ed9-cf5d-4876-93d2-690ba449e153\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d83a32b-c8fc-438b-9797-664c5a2c5360-config\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9c9e052-a750-4b4f-873c-9808c0a3c75b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc 
kubenswrapper[4764]: I1203 23:43:22.770809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c34ace5f-42af-4058-9792-acfc9340252e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v967\" (UniqueName: \"kubernetes.io/projected/163c97f5-601b-47dd-8653-5dffdf157659-kube-api-access-6v967\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsj5s\" (UniqueName: \"kubernetes.io/projected/4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0-kube-api-access-bsj5s\") pod \"dns-operator-744455d44c-glq4z\" (UID: \"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0\") " pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcc2d\" (UniqueName: \"kubernetes.io/projected/e21b87e1-9839-4439-89ac-32f6a196774b-kube-api-access-dcc2d\") pod \"multus-admission-controller-857f4d67dd-66kxp\" (UID: \"e21b87e1-9839-4439-89ac-32f6a196774b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770903 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-plugins-dir\") pod 
\"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.771908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c49dcd-28aa-42f9-92fd-7e033cba7846-config\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.771965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9111b8fb-b071-4986-b735-86d3a3a3322c-config\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.770458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-tls\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.767137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.767406 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/163c97f5-601b-47dd-8653-5dffdf157659-trusted-ca\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.772459 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.772479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c34ace5f-42af-4058-9792-acfc9340252e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: E1203 23:43:22.772525 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.272508434 +0000 UTC m=+139.033832955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.772690 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d7ae1964-7175-4a71-a02b-8ff8a274a5da-srv-cert\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.767757 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.768084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/163c97f5-601b-47dd-8653-5dffdf157659-metrics-tls\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.773230 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d83a32b-c8fc-438b-9797-664c5a2c5360-serving-cert\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: 
\"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.773452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.773641 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-auth-proxy-config\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.773771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-ca\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.773934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.774576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1eb5f7cc-918b-4506-bc59-ccbb930e763f-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.774726 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8f80dc8-d173-44a3-bab4-a3ae88319387-config\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.768524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-ca-trust-extracted\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.775225 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.768235 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-srv-cert\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 
23:43:22.775433 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9111b8fb-b071-4986-b735-86d3a3a3322c-trusted-ca\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.775568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75845c6a-1b60-4846-a1ef-7e719c5ce398-config\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.776675 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/d7ae1964-7175-4a71-a02b-8ff8a274a5da-profile-collector-cert\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.776935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d83a32b-c8fc-438b-9797-664c5a2c5360-config\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.778080 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c4ea958-3175-42d0-9e0a-26225cada08a-serving-cert\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.777672 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-metrics-certs\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.778705 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9c9e052-a750-4b4f-873c-9808c0a3c75b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.779476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.787356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0-metrics-tls\") pod \"dns-operator-744455d44c-glq4z\" (UID: \"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0\") " pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.787708 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8c4ea958-3175-42d0-9e0a-26225cada08a-etcd-client\") pod \"etcd-operator-b45778765-8wjsw\" 
(UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.787939 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/75845c6a-1b60-4846-a1ef-7e719c5ce398-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.788203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.789397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22c49dcd-28aa-42f9-92fd-7e033cba7846-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.791679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-installation-pull-secrets\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.788560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.788921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e21b87e1-9839-4439-89ac-32f6a196774b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-66kxp\" (UID: \"e21b87e1-9839-4439-89ac-32f6a196774b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.789017 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.789072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-stats-auth\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.789332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cacf2ed9-cf5d-4876-93d2-690ba449e153-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kqhxc\" (UID: \"cacf2ed9-cf5d-4876-93d2-690ba449e153\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" 
Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.789468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-proxy-tls\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.789512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.788395 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.788457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ed746947-b1ba-426d-92a8-02db2a949e4b-default-certificate\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.792768 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c34ace5f-42af-4058-9792-acfc9340252e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: 
\"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.812156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cppst\" (UniqueName: \"kubernetes.io/projected/1eb5f7cc-918b-4506-bc59-ccbb930e763f-kube-api-access-cppst\") pod \"kube-storage-version-migrator-operator-b67b599dd-5lcxc\" (UID: \"1eb5f7cc-918b-4506-bc59-ccbb930e763f\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.813338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" event={"ID":"91d27877-3051-420c-a43e-f12be4e82450","Type":"ContainerStarted","Data":"6cb83dfa05ea24d04c8da548b9aefca53087c4677a026db017b26756c8a60d8f"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.819227 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxzk5\" (UniqueName: \"kubernetes.io/projected/28c57be9-2500-4944-abed-6fe2e4e2dd0d-kube-api-access-jxzk5\") pod \"oauth-openshift-558db77b4-cgmmp\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.828745 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l7xdw" event={"ID":"d26ed3c8-0bba-40a7-a18a-e8718b336dcc","Type":"ContainerStarted","Data":"b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.828785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l7xdw" 
event={"ID":"d26ed3c8-0bba-40a7-a18a-e8718b336dcc","Type":"ContainerStarted","Data":"915e11b7b27fbb8e0b530162730f310121c20f41c833492cc0a94f37c6346c27"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.830764 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.836107 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" event={"ID":"6590d259-1d9c-41e2-b070-5e7a1fa53d34","Type":"ContainerStarted","Data":"24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.836143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" event={"ID":"6590d259-1d9c-41e2-b070-5e7a1fa53d34","Type":"ContainerStarted","Data":"3e7a95eb0c7c2d3d30c41e2fcdbecb1d7689d7cc86259105d2329bbc5d42d26d"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.836693 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.837386 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46czq\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-kube-api-access-46czq\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.837571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" 
event={"ID":"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3","Type":"ContainerStarted","Data":"3900258f706853dd9d019f4974773added8019f45dd7d31fd3755f28c8e09ea0"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.837648 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" event={"ID":"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3","Type":"ContainerStarted","Data":"f2129f62fabe2f4d6ee88a2dfce7791967e0c54cee14af6e78c3e16c3cc3afda"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.839040 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" event={"ID":"a44e8e46-19c5-4242-8186-12ec04167e59","Type":"ContainerStarted","Data":"0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.839063 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" event={"ID":"a44e8e46-19c5-4242-8186-12ec04167e59","Type":"ContainerStarted","Data":"869d1daf7ebda525ee48ddfef0b28e75b085251bf882459bf88d630e49ac283f"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.839502 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.843173 4764 generic.go:334] "Generic (PLEG): container finished" podID="82bbef02-a9d4-42e3-a874-f702e232be80" containerID="0017cd32816fa48cd835bc0f586a58bc00bafd67eeb80d27192932f01be8d19e" exitCode=0 Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.843273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" event={"ID":"82bbef02-a9d4-42e3-a874-f702e232be80","Type":"ContainerDied","Data":"0017cd32816fa48cd835bc0f586a58bc00bafd67eeb80d27192932f01be8d19e"} Dec 03 23:43:22 crc 
kubenswrapper[4764]: I1203 23:43:22.843305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" event={"ID":"82bbef02-a9d4-42e3-a874-f702e232be80","Type":"ContainerStarted","Data":"a684d88244b2f92c1b9a9f1be9b2fb35428ae54f9b80b9407dfbfff78d49e2e2"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.848594 4764 generic.go:334] "Generic (PLEG): container finished" podID="5275f70a-3cd1-4955-8e28-6027b725376d" containerID="30131afcacd50e4b9628ad1d39e9a0a4cf090210915858980d19f155a5a65df4" exitCode=0 Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.848632 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" event={"ID":"5275f70a-3cd1-4955-8e28-6027b725376d","Type":"ContainerDied","Data":"30131afcacd50e4b9628ad1d39e9a0a4cf090210915858980d19f155a5a65df4"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.848659 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" event={"ID":"5275f70a-3cd1-4955-8e28-6027b725376d","Type":"ContainerStarted","Data":"2a64ef955e781c14d2adb12a600d794704a3133decd33abfd46175ef078c71d6"} Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.853728 4764 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bn7zt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.853767 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" podUID="a44e8e46-19c5-4242-8186-12ec04167e59" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 03 
23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.857580 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vlqkp"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.861147 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vh9\" (UniqueName: \"kubernetes.io/projected/6669d173-9f6e-49d9-8159-5c0406eedac9-kube-api-access-42vh9\") pod \"control-plane-machine-set-operator-78cbb6b69f-fnct7\" (UID: \"6669d173-9f6e-49d9-8159-5c0406eedac9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.871981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872047 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b194a71-9d06-4186-8811-8668eaae38b0-apiservice-cert\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-metrics-tls\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " 
pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220eeac0-5b43-462d-89cc-5182a6b1f686-config-volume\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b194a71-9d06-4186-8811-8668eaae38b0-webhook-cert\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/220eeac0-5b43-462d-89cc-5182a6b1f686-secret-volume\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wz55\" (UniqueName: \"kubernetes.io/projected/6042f2bb-d728-462a-a118-e2712e9a214f-kube-api-access-9wz55\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0b194a71-9d06-4186-8811-8668eaae38b0-tmpfs\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: 
\"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1a404e9-c110-4033-9ef0-52c3d6a28d87-cert\") pod \"ingress-canary-tw5p6\" (UID: \"d1a404e9-c110-4033-9ef0-52c3d6a28d87\") " pod="openshift-ingress-canary/ingress-canary-tw5p6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-socket-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-csi-data-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jd6\" (UniqueName: \"kubernetes.io/projected/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-kube-api-access-d7jd6\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872704 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-mountpoint-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " 
pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ad151012-706a-4e5d-9565-70f905dd6a87-certs\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872845 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-config-volume\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbgs\" (UniqueName: \"kubernetes.io/projected/ad151012-706a-4e5d-9565-70f905dd6a87-kube-api-access-fkbgs\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpmwq\" (UniqueName: \"kubernetes.io/projected/d1a404e9-c110-4033-9ef0-52c3d6a28d87-kube-api-access-gpmwq\") pod \"ingress-canary-tw5p6\" (UID: \"d1a404e9-c110-4033-9ef0-52c3d6a28d87\") " pod="openshift-ingress-canary/ingress-canary-tw5p6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.872974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ad151012-706a-4e5d-9565-70f905dd6a87-node-bootstrap-token\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " 
pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.873062 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-registration-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.873139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8blt\" (UniqueName: \"kubernetes.io/projected/0b194a71-9d06-4186-8811-8668eaae38b0-kube-api-access-j8blt\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.873448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkjts\" (UniqueName: \"kubernetes.io/projected/220eeac0-5b43-462d-89cc-5182a6b1f686-kube-api-access-lkjts\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.873517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-plugins-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.873858 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-plugins-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: 
\"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.873925 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-socket-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.874526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-csi-data-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.874860 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-metrics-tls\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: E1203 23:43:22.874937 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.374922392 +0000 UTC m=+139.136246803 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.875047 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-mountpoint-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.876893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-config-volume\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.877125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220eeac0-5b43-462d-89cc-5182a6b1f686-config-volume\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.877316 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6042f2bb-d728-462a-a118-e2712e9a214f-registration-dir\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h" Dec 
03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.877688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/0b194a71-9d06-4186-8811-8668eaae38b0-tmpfs\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: W1203 23:43:22.877733 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a5d117_9aa6_4b48_8862_2be01934454a.slice/crio-7365748f8e30a34a84dbb3a21306dcb7a640c9c9c318074977a83f5ca0d3f018 WatchSource:0}: Error finding container 7365748f8e30a34a84dbb3a21306dcb7a640c9c9c318074977a83f5ca0d3f018: Status 404 returned error can't find the container with id 7365748f8e30a34a84dbb3a21306dcb7a640c9c9c318074977a83f5ca0d3f018 Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.879342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1a404e9-c110-4033-9ef0-52c3d6a28d87-cert\") pod \"ingress-canary-tw5p6\" (UID: \"d1a404e9-c110-4033-9ef0-52c3d6a28d87\") " pod="openshift-ingress-canary/ingress-canary-tw5p6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.880175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kglg6\" (UniqueName: \"kubernetes.io/projected/9111b8fb-b071-4986-b735-86d3a3a3322c-kube-api-access-kglg6\") pod \"console-operator-58897d9998-mh8l6\" (UID: \"9111b8fb-b071-4986-b735-86d3a3a3322c\") " pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.880442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ad151012-706a-4e5d-9565-70f905dd6a87-certs\") pod \"machine-config-server-d7hth\" (UID: 
\"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.882288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0b194a71-9d06-4186-8811-8668eaae38b0-apiservice-cert\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.882317 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0b194a71-9d06-4186-8811-8668eaae38b0-webhook-cert\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.892247 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/220eeac0-5b43-462d-89cc-5182a6b1f686-secret-volume\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.893036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ad151012-706a-4e5d-9565-70f905dd6a87-node-bootstrap-token\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.897276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7hn\" (UniqueName: 
\"kubernetes.io/projected/8d83a32b-c8fc-438b-9797-664c5a2c5360-kube-api-access-ww7hn\") pod \"service-ca-operator-777779d784-tw2qg\" (UID: \"8d83a32b-c8fc-438b-9797-664c5a2c5360\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.900142 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.904084 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-f2nwm"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.919362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22c49dcd-28aa-42f9-92fd-7e033cba7846-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-pc59h\" (UID: \"22c49dcd-28aa-42f9-92fd-7e033cba7846\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.924704 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.963551 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj9mt\" (UniqueName: \"kubernetes.io/projected/b9c9e052-a750-4b4f-873c-9808c0a3c75b-kube-api-access-wj9mt\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.970617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9kz7\" (UniqueName: \"kubernetes.io/projected/cacf2ed9-cf5d-4876-93d2-690ba449e153-kube-api-access-l9kz7\") pod \"package-server-manager-789f6589d5-kqhxc\" (UID: \"cacf2ed9-cf5d-4876-93d2-690ba449e153\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.972358 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl"] Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.972465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.975204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:22 crc kubenswrapper[4764]: E1203 23:43:22.976994 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.476981661 +0000 UTC m=+139.238306072 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:22 crc kubenswrapper[4764]: I1203 23:43:22.996373 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84w6l\" (UniqueName: \"kubernetes.io/projected/b8f80dc8-d173-44a3-bab4-a3ae88319387-kube-api-access-84w6l\") pod \"authentication-operator-69f744f599-4js5f\" (UID: \"b8f80dc8-d173-44a3-bab4-a3ae88319387\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.009959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs27m\" (UniqueName: 
\"kubernetes.io/projected/8c9c7d58-1adb-44f7-9a81-f08161a61c6a-kube-api-access-rs27m\") pod \"catalog-operator-68c6474976-qb6kg\" (UID: \"8c9c7d58-1adb-44f7-9a81-f08161a61c6a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.029769 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.038354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-bound-sa-token\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.038601 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.042410 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdzl\" (UniqueName: \"kubernetes.io/projected/74e0227d-555c-4d47-8ec0-8b3a6ef73d74-kube-api-access-qsdzl\") pod \"openshift-controller-manager-operator-756b6f6bc6-9px68\" (UID: \"74e0227d-555c-4d47-8ec0-8b3a6ef73d74\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.057258 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.059100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8jw\" (UniqueName: \"kubernetes.io/projected/ed746947-b1ba-426d-92a8-02db2a949e4b-kube-api-access-8b8jw\") pod \"router-default-5444994796-wxtcz\" (UID: \"ed746947-b1ba-426d-92a8-02db2a949e4b\") " pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.059676 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc"] Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.087965 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.088866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.089320 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.589305042 +0000 UTC m=+139.350629453 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.114094 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tl5lg"] Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.123225 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.134264 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/75845c6a-1b60-4846-a1ef-7e719c5ce398-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qxbf2\" (UID: \"75845c6a-1b60-4846-a1ef-7e719c5ce398\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.139900 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.145268 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpw4j\" (UniqueName: \"kubernetes.io/projected/d7ae1964-7175-4a71-a02b-8ff8a274a5da-kube-api-access-lpw4j\") pod \"olm-operator-6b444d44fb-62qv5\" (UID: \"d7ae1964-7175-4a71-a02b-8ff8a274a5da\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 
23:43:23.152150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/163c97f5-601b-47dd-8653-5dffdf157659-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.164540 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.168215 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-fzmrn"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.170563 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34ace5f-42af-4058-9792-acfc9340252e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v245m\" (UID: \"c34ace5f-42af-4058-9792-acfc9340252e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.188450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9c9e052-a750-4b4f-873c-9808c0a3c75b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-phcd4\" (UID: \"b9c9e052-a750-4b4f-873c-9808c0a3c75b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.190171 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.191071 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.191367 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.69135352 +0000 UTC m=+139.452677931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.197726 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.198210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4twg4\" (UniqueName: \"kubernetes.io/projected/eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb-kube-api-access-4twg4\") pod \"machine-config-operator-74547568cd-vhrb8\" (UID: \"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.219518 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2kq\" (UniqueName: \"kubernetes.io/projected/c9415d09-8034-4627-80dc-ae731d9f466e-kube-api-access-kn2kq\") pod \"marketplace-operator-79b997595-sws9j\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " pod="openshift-marketplace/marketplace-operator-79b997595-sws9j"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.235195 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.243972 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.245758 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v967\" (UniqueName: \"kubernetes.io/projected/163c97f5-601b-47dd-8653-5dffdf157659-kube-api-access-6v967\") pod \"ingress-operator-5b745b69d9-7tgwq\" (UID: \"163c97f5-601b-47dd-8653-5dffdf157659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq"
Dec 03 23:43:23 crc kubenswrapper[4764]: W1203 23:43:23.260953 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a14adc_ebea_42b0_bd10_1a82d58d7c0e.slice/crio-cdd111d61540e95ec871f8717b328f80c4aada82a9e4dfb81b62ce8d21777e2e WatchSource:0}: Error finding container cdd111d61540e95ec871f8717b328f80c4aada82a9e4dfb81b62ce8d21777e2e: Status 404 returned error can't find the container with id cdd111d61540e95ec871f8717b328f80c4aada82a9e4dfb81b62ce8d21777e2e
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.263010 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsj5s\" (UniqueName: \"kubernetes.io/projected/4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0-kube-api-access-bsj5s\") pod \"dns-operator-744455d44c-glq4z\" (UID: \"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0\") " pod="openshift-dns-operator/dns-operator-744455d44c-glq4z"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.280546 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcc2d\" (UniqueName: \"kubernetes.io/projected/e21b87e1-9839-4439-89ac-32f6a196774b-kube-api-access-dcc2d\") pod \"multus-admission-controller-857f4d67dd-66kxp\" (UID: \"e21b87e1-9839-4439-89ac-32f6a196774b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.282501 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-glq4z"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.292029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.292349 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.792333878 +0000 UTC m=+139.553658289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.295019 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.306524 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pn6j\" (UniqueName: \"kubernetes.io/projected/8c4ea958-3175-42d0-9e0a-26225cada08a-kube-api-access-9pn6j\") pod \"etcd-operator-b45778765-8wjsw\" (UID: \"8c4ea958-3175-42d0-9e0a-26225cada08a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.338479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jd6\" (UniqueName: \"kubernetes.io/projected/f30cc0ba-74c2-44f7-adc3-9f80cb4dac82-kube-api-access-d7jd6\") pod \"dns-default-2lqqs\" (UID: \"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82\") " pod="openshift-dns/dns-default-2lqqs"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.368063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wz55\" (UniqueName: \"kubernetes.io/projected/6042f2bb-d728-462a-a118-e2712e9a214f-kube-api-access-9wz55\") pod \"csi-hostpathplugin-sm25h\" (UID: \"6042f2bb-d728-462a-a118-e2712e9a214f\") " pod="hostpath-provisioner/csi-hostpathplugin-sm25h"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.368379 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.374869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbgs\" (UniqueName: \"kubernetes.io/projected/ad151012-706a-4e5d-9565-70f905dd6a87-kube-api-access-fkbgs\") pod \"machine-config-server-d7hth\" (UID: \"ad151012-706a-4e5d-9565-70f905dd6a87\") " pod="openshift-machine-config-operator/machine-config-server-d7hth"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.376574 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.393866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.394144 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.89413357 +0000 UTC m=+139.655457981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.403201 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.404361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpmwq\" (UniqueName: \"kubernetes.io/projected/d1a404e9-c110-4033-9ef0-52c3d6a28d87-kube-api-access-gpmwq\") pod \"ingress-canary-tw5p6\" (UID: \"d1a404e9-c110-4033-9ef0-52c3d6a28d87\") " pod="openshift-ingress-canary/ingress-canary-tw5p6"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.418448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkjts\" (UniqueName: \"kubernetes.io/projected/220eeac0-5b43-462d-89cc-5182a6b1f686-kube-api-access-lkjts\") pod \"collect-profiles-29413410-zrhh8\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.437004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8pc2\" (UniqueName: \"kubernetes.io/projected/f919d5d0-c940-484d-9017-ada61860924a-kube-api-access-m8pc2\") pod \"machine-config-controller-84d6567774-qr6vp\" (UID: \"f919d5d0-c940-484d-9017-ada61860924a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.444608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8blt\" (UniqueName: \"kubernetes.io/projected/0b194a71-9d06-4186-8811-8668eaae38b0-kube-api-access-j8blt\") pod \"packageserver-d55dfcdfc-5bbgm\" (UID: \"0b194a71-9d06-4186-8811-8668eaae38b0\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.444959 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.454168 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.480983 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.497373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.498232 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.998215556 +0000 UTC m=+139.759539967 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.498285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.498564 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:23.998557606 +0000 UTC m=+139.759882007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.519585 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4js5f"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.519630 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.547234 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.557687 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.559431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.569814 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.576858 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cgmmp"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.580508 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-sm25h"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.588077 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2lqqs"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.595525 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tw5p6"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.598843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.599001 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.098977418 +0000 UTC m=+139.860301829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.599079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.599610 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.099603465 +0000 UTC m=+139.860927877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.607599 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-d7hth"
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.704707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.706636 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.206614195 +0000 UTC m=+139.967938626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.715978 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.736006 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sws9j"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.763363 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-mh8l6"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.771164 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.810088 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.810653 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.31063723 +0000 UTC m=+140.071961641 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: W1203 23:43:23.820216 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22c49dcd_28aa_42f9_92fd_7e033cba7846.slice/crio-64e7dd9a32a9362402672d05147bdcfdcb77c7f63b305460107bfc35ee259f71 WatchSource:0}: Error finding container 64e7dd9a32a9362402672d05147bdcfdcb77c7f63b305460107bfc35ee259f71: Status 404 returned error can't find the container with id 64e7dd9a32a9362402672d05147bdcfdcb77c7f63b305460107bfc35ee259f71
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.822460 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.868225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" event={"ID":"1eb5f7cc-918b-4506-bc59-ccbb930e763f","Type":"ContainerStarted","Data":"0574dbec36fbfdf297a9078f0613aacab18f9f348292728a3014291cc6b2b3da"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.880348 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.895702 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc"]
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.903024 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" event={"ID":"22c49dcd-28aa-42f9-92fd-7e033cba7846","Type":"ContainerStarted","Data":"64e7dd9a32a9362402672d05147bdcfdcb77c7f63b305460107bfc35ee259f71"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.915903 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:23 crc kubenswrapper[4764]: E1203 23:43:23.918697 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.418678348 +0000 UTC m=+140.180002759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.925579 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" event={"ID":"d4a5d117-9aa6-4b48-8862-2be01934454a","Type":"ContainerStarted","Data":"0003eb2393dba9f6345c7c2f195a72a43f41cd6f425eb89d3264dba42b3304cb"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.925615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" event={"ID":"d4a5d117-9aa6-4b48-8862-2be01934454a","Type":"ContainerStarted","Data":"7365748f8e30a34a84dbb3a21306dcb7a640c9c9c318074977a83f5ca0d3f018"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.927078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" event={"ID":"fb55c6e2-2206-4735-ba1a-bfbff1e7549a","Type":"ContainerStarted","Data":"34e3d70bbea3f28b74aeef6968faaa94a4913a4b389838820813fbd8beb94a8d"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.931204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" event={"ID":"d7ae1964-7175-4a71-a02b-8ff8a274a5da","Type":"ContainerStarted","Data":"d462cc152003e3364ddefcf3aa9395f600bf5fd147bb1e09eb04c3dada984f43"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.934475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wxtcz" event={"ID":"ed746947-b1ba-426d-92a8-02db2a949e4b","Type":"ContainerStarted","Data":"c7034f10ee25149b682132870a58f1a085c7ac2fefafeda4e157c86eb8d2a064"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.936575 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f2nwm" event={"ID":"d64b774a-681b-4c0f-b2eb-36398275e451","Type":"ContainerStarted","Data":"e3a35cdbdb028f6991070a3e6421d65a41873337a1449d8876efc785046a671d"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.936654 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-f2nwm" event={"ID":"d64b774a-681b-4c0f-b2eb-36398275e451","Type":"ContainerStarted","Data":"9e749cc321a516ebe5a94b7ce00962bcdafc93bc1320a7ea4436a23d2eae5f34"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.940543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" event={"ID":"91d27877-3051-420c-a43e-f12be4e82450","Type":"ContainerStarted","Data":"8192a24d8f3e6527870e936584171ecd5d7e01d432ea4b3880b8d5812bdc806a"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.951572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" event={"ID":"c9415d09-8034-4627-80dc-ae731d9f466e","Type":"ContainerStarted","Data":"dacc0af9bf867e357193602457d7abe033b3fcd97dfc9660b291a27ac8683003"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.954235 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" event={"ID":"9111b8fb-b071-4986-b735-86d3a3a3322c","Type":"ContainerStarted","Data":"846fb33dc5c088dbc855cb9bf1c0186c3aefa12e96639665434f35d4bd615adf"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.972311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" event={"ID":"72a14adc-ebea-42b0-bd10-1a82d58d7c0e","Type":"ContainerStarted","Data":"cdd111d61540e95ec871f8717b328f80c4aada82a9e4dfb81b62ce8d21777e2e"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.977930 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" event={"ID":"74e0227d-555c-4d47-8ec0-8b3a6ef73d74","Type":"ContainerStarted","Data":"f02300743a218e47f2fb679996221d5d43f6c998510118e9eb2c1a037fb1d143"}
Dec 03 23:43:23 crc kubenswrapper[4764]: I1203 23:43:23.988676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" event={"ID":"5baf5265-7d93-41ff-a1a9-1cdeda3e38f3","Type":"ContainerStarted","Data":"fab884be4f0a351f0193f6467642c93ba3cc646fc8299000394c11566e9a6e88"}
Dec 03 23:43:23 crc kubenswrapper[4764]: W1203 23:43:23.988762 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c9c7d58_1adb_44f7_9a81_f08161a61c6a.slice/crio-2c0bd05303faef22571ca73eaf96646a0b4fb8ff3f5afed621eca569d9d0b09f WatchSource:0}: Error finding container 2c0bd05303faef22571ca73eaf96646a0b4fb8ff3f5afed621eca569d9d0b09f: Status 404 returned error can't find the container with id 2c0bd05303faef22571ca73eaf96646a0b4fb8ff3f5afed621eca569d9d0b09f
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.019634 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" event={"ID":"82bbef02-a9d4-42e3-a874-f702e232be80","Type":"ContainerStarted","Data":"c6f1579e642273818dd2c50742013ac2efeaba885156a7dc5939ddf03e5c9cc7"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.020265 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx"
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.020808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.021077 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.521063816 +0000 UTC m=+140.282388227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.050205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" event={"ID":"8d83a32b-c8fc-438b-9797-664c5a2c5360","Type":"ContainerStarted","Data":"cc0ab6e63d799062482452f8a783f40c6ff116ca5d48b39fef7a23ef1956a61a"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.050825 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-glq4z"]
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.054058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" event={"ID":"e8324e0f-a676-4015-bbcb-cf68235eb72a","Type":"ContainerStarted","Data":"29e4329c4d4d2a8ad5bb5eba3b3583ae70ce5328f0d4cdf070fe6c79f2a8edb0"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.057351 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" event={"ID":"6669d173-9f6e-49d9-8159-5c0406eedac9","Type":"ContainerStarted","Data":"b832e9aef1d1eb8af97e34f3c3ff40621df086b12a546ea263d24c43b956a594"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.060770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" event={"ID":"0af14017-eba4-401e-9f16-d4a7fd22b6b8","Type":"ContainerStarted","Data":"1687035745bd126ff0997764c781d46c2499cac6111e9ba298354d2cc0db8d37"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.060832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" event={"ID":"0af14017-eba4-401e-9f16-d4a7fd22b6b8","Type":"ContainerStarted","Data":"1fb0ad38f06c12e04823b36d9d8cc5020f0e27fc9e7ae86e7c77ea01411cafc4"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.062677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" event={"ID":"28c57be9-2500-4944-abed-6fe2e4e2dd0d","Type":"ContainerStarted","Data":"9b82c055e222596b7ff55f2ee550ae1d19c1fc97d281fe5d115833d167ee5196"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.066408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" event={"ID":"b8f80dc8-d173-44a3-bab4-a3ae88319387","Type":"ContainerStarted","Data":"a0338043f7e481ac6f9d8f044ba86a27c06680a205a2197ad72f5d2e7aedd608"}
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.074777 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt"
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.124545 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.125476 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.62545757 +0000 UTC m=+140.386781981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.142132 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4"]
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.225929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.227590 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.727578191 +0000 UTC m=+140.488902602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.327665 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.327876 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.827819748 +0000 UTC m=+140.589144159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.327997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.329039 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.829021942 +0000 UTC m=+140.590346353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.436681 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.437012 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:24.936996229 +0000 UTC m=+140.698320630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.446053 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8wjsw"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.474016 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-66kxp"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.477918 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.552165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.552572 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.052558461 +0000 UTC m=+140.813882872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.653227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.656959 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.156932356 +0000 UTC m=+140.918256767 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.658582 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.659955 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.159939261 +0000 UTC m=+140.921263672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.672819 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.674605 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.694306 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.759187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.759847 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.259828888 +0000 UTC m=+141.021153299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.767422 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" podStartSLOduration=121.767407454 podStartE2EDuration="2m1.767407454s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:24.767004152 +0000 UTC m=+140.528328563" watchObservedRunningTime="2025-12-03 23:43:24.767407454 +0000 UTC m=+140.528731865" Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.822976 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tw5p6"] Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.868637 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-l7xdw" podStartSLOduration=121.868620228 podStartE2EDuration="2m1.868620228s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:24.848670672 +0000 UTC m=+140.609995083" watchObservedRunningTime="2025-12-03 23:43:24.868620228 +0000 UTC m=+140.629944639" Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.869970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.870256 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.370245434 +0000 UTC m=+141.131569845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.909231 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" podStartSLOduration=121.909215391 podStartE2EDuration="2m1.909215391s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:24.909150399 +0000 UTC m=+140.670474810" watchObservedRunningTime="2025-12-03 23:43:24.909215391 +0000 UTC m=+140.670539802" Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.936264 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kqj2v" podStartSLOduration=121.936248529 podStartE2EDuration="2m1.936248529s" podCreationTimestamp="2025-12-03 23:41:23 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:24.934686285 +0000 UTC m=+140.696010696" watchObservedRunningTime="2025-12-03 23:43:24.936248529 +0000 UTC m=+140.697572930" Dec 03 23:43:24 crc kubenswrapper[4764]: W1203 23:43:24.964500 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf919d5d0_c940_484d_9017_ada61860924a.slice/crio-399e51bb14a427c406decb8227251cf85c372fc447d8ea0b7667d398b6644ecc WatchSource:0}: Error finding container 399e51bb14a427c406decb8227251cf85c372fc447d8ea0b7667d398b6644ecc: Status 404 returned error can't find the container with id 399e51bb14a427c406decb8227251cf85c372fc447d8ea0b7667d398b6644ecc Dec 03 23:43:24 crc kubenswrapper[4764]: I1203 23:43:24.972428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:24 crc kubenswrapper[4764]: E1203 23:43:24.972756 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.472731565 +0000 UTC m=+141.234055976 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:24 crc kubenswrapper[4764]: W1203 23:43:24.977957 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a404e9_c110_4033_9ef0_52c3d6a28d87.slice/crio-cc5b820f484b7528df294a4d748f6ff11a192ec8cd7fbb0c8cb6a163b29ee0af WatchSource:0}: Error finding container cc5b820f484b7528df294a4d748f6ff11a192ec8cd7fbb0c8cb6a163b29ee0af: Status 404 returned error can't find the container with id cc5b820f484b7528df294a4d748f6ff11a192ec8cd7fbb0c8cb6a163b29ee0af Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.032352 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"] Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.044555 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm"] Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.047813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" podStartSLOduration=121.047792227 podStartE2EDuration="2m1.047792227s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:25.040112539 +0000 UTC m=+140.801436950" watchObservedRunningTime="2025-12-03 23:43:25.047792227 +0000 UTC m=+140.809116638" Dec 03 23:43:25 
crc kubenswrapper[4764]: I1203 23:43:25.065444 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-sm25h"] Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.110338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.110932 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.610861609 +0000 UTC m=+141.372186020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.133271 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" event={"ID":"75845c6a-1b60-4846-a1ef-7e719c5ce398","Type":"ContainerStarted","Data":"09188ea87dcf65669302fd18e28a6cd5ef9cb8284c2274b311fb81587800fcdb"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.180010 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kxpmm" podStartSLOduration=122.179992222 podStartE2EDuration="2m2.179992222s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:25.107773551 +0000 UTC m=+140.869097962" watchObservedRunningTime="2025-12-03 23:43:25.179992222 +0000 UTC m=+140.941316633" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.182359 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2lqqs"] Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.194370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" event={"ID":"d4a5d117-9aa6-4b48-8862-2be01934454a","Type":"ContainerStarted","Data":"653c303f2465a2707f2819a09153c8c4958abff82cbcde850cb9d1b1254d7661"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.206338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" event={"ID":"f919d5d0-c940-484d-9017-ada61860924a","Type":"ContainerStarted","Data":"399e51bb14a427c406decb8227251cf85c372fc447d8ea0b7667d398b6644ecc"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.213011 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.213770 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 23:43:25.713751841 +0000 UTC m=+141.475076252 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.215582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" event={"ID":"b9c9e052-a750-4b4f-873c-9808c0a3c75b","Type":"ContainerStarted","Data":"65e6c859b77c75ff6334b641007f64c324cb9c3d4b152a13173ff8b977b2ed76"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.217489 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" event={"ID":"c34ace5f-42af-4058-9792-acfc9340252e","Type":"ContainerStarted","Data":"9108f13180cca1ab3d3308ce9e618330e15b278c5601215b4a7946062bd3e062"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.262044 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" event={"ID":"fb55c6e2-2206-4735-ba1a-bfbff1e7549a","Type":"ContainerStarted","Data":"9a8c2b34543652e885405398ffd9cf126d586faabf0e4c4871e57ef904270d99"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.297169 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" event={"ID":"6669d173-9f6e-49d9-8159-5c0406eedac9","Type":"ContainerStarted","Data":"bca7467fa5d69724484bcf43407bf087ab6c87605d67be665fed19fdbc1adc65"} Dec 03 23:43:25 crc 
kubenswrapper[4764]: I1203 23:43:25.314498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.315635 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.815618534 +0000 UTC m=+141.576942945 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.367655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wxtcz" event={"ID":"ed746947-b1ba-426d-92a8-02db2a949e4b","Type":"ContainerStarted","Data":"b55714632ee2c0ff2f2aff32e129c8a49bbd16008a8793700846704f38b9a905"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.405990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" event={"ID":"5275f70a-3cd1-4955-8e28-6027b725376d","Type":"ContainerStarted","Data":"2b3ccf10a6d422fa17c7965d820def34c8aac29a8aa94dc7bd34bf206277c27f"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.415276 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.415670 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:25.915650066 +0000 UTC m=+141.676974477 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.447088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" event={"ID":"9111b8fb-b071-4986-b735-86d3a3a3322c","Type":"ContainerStarted","Data":"5951cc0d4837978160694a1b522feee4067e138fa7a38eb5173d8cfe0c4e156c"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.448213 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.457066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" 
event={"ID":"e21b87e1-9839-4439-89ac-32f6a196774b","Type":"ContainerStarted","Data":"99992991ccf5c83cc1e8122979aad357fbb3cdb9f21a8cc83f6cec6ffad0ed60"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.459015 4764 patch_prober.go:28] interesting pod/console-operator-58897d9998-mh8l6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.459056 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" podUID="9111b8fb-b071-4986-b735-86d3a3a3322c" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.469548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" event={"ID":"cacf2ed9-cf5d-4876-93d2-690ba449e153","Type":"ContainerStarted","Data":"f5548861e4576cfc265876488186cfb3d0faa7f389dc7fab4bf635f99e188ed4"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.469590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" event={"ID":"cacf2ed9-cf5d-4876-93d2-690ba449e153","Type":"ContainerStarted","Data":"cc0282431006ab2416aab22eeaad57d6cc3b97d9ad2d80cb0f8196415b0c498b"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.484874 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tw5p6" event={"ID":"d1a404e9-c110-4033-9ef0-52c3d6a28d87","Type":"ContainerStarted","Data":"cc5b820f484b7528df294a4d748f6ff11a192ec8cd7fbb0c8cb6a163b29ee0af"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.515570 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" event={"ID":"d7ae1964-7175-4a71-a02b-8ff8a274a5da","Type":"ContainerStarted","Data":"a43e598cc40ce0c86036915ede2857528223dedceb7f9781da2940e2cc967b50"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.516872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.517332 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.519373 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.019344731 +0000 UTC m=+141.780669142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.534074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" event={"ID":"8c9c7d58-1adb-44f7-9a81-f08161a61c6a","Type":"ContainerStarted","Data":"eff75fde11875bd68d8b6dbe0f9ad233a37eccc66704af2a6250aeacc031c17b"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.534124 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" event={"ID":"8c9c7d58-1adb-44f7-9a81-f08161a61c6a","Type":"ContainerStarted","Data":"2c0bd05303faef22571ca73eaf96646a0b4fb8ff3f5afed621eca569d9d0b09f"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.534663 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.549532 4764 generic.go:334] "Generic (PLEG): container finished" podID="e8324e0f-a676-4015-bbcb-cf68235eb72a" containerID="df1f963c2dcd4215ef74baf46aa1ae18883c1c378e46f1f564d144f2e83176c4" exitCode=0 Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.549792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" event={"ID":"e8324e0f-a676-4015-bbcb-cf68235eb72a","Type":"ContainerDied","Data":"df1f963c2dcd4215ef74baf46aa1ae18883c1c378e46f1f564d144f2e83176c4"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.557556 
4764 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qb6kg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.557595 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" podUID="8c9c7d58-1adb-44f7-9a81-f08161a61c6a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.563209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d7hth" event={"ID":"ad151012-706a-4e5d-9565-70f905dd6a87","Type":"ContainerStarted","Data":"fb6bcc295d57ed905c4a8111326a4bcd5d6f7f4d5fb34d42f03479530bc02f57"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.573769 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.595385 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" event={"ID":"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb","Type":"ContainerStarted","Data":"58a331fa1dc42f538ed3cdee051c5f022acabf7f55cd5cb645dc7d4f226105e1"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.614786 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" event={"ID":"163c97f5-601b-47dd-8653-5dffdf157659","Type":"ContainerStarted","Data":"1400e08b267b3e16e0e2a665996503bc825384efb7bfd993df0d2bb9b5a930f6"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 
23:43:25.621148 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.622113 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.122097679 +0000 UTC m=+141.883422090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.633740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" event={"ID":"b8f80dc8-d173-44a3-bab4-a3ae88319387","Type":"ContainerStarted","Data":"3f70fb4840aef848c29938a642616d90550b337ebb7b04febc9b490081e792fc"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.648583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" event={"ID":"1eb5f7cc-918b-4506-bc59-ccbb930e763f","Type":"ContainerStarted","Data":"3f547d4e47fd474ae357e0a7dd1ed44f2ec89ab1727add41cd792e5b10d9be48"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.655492 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" event={"ID":"74e0227d-555c-4d47-8ec0-8b3a6ef73d74","Type":"ContainerStarted","Data":"2ad1aeb08619117972dea1b12cbf06ef2e1bc92883263588610a39b3cfb0f773"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.708015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" event={"ID":"c9415d09-8034-4627-80dc-ae731d9f466e","Type":"ContainerStarted","Data":"96b52f286694278c724a4e1484a8a84187fcc67c763f8c345700208209105ba3"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.710286 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.727545 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.738459 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.238443244 +0000 UTC m=+141.999767655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.753931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" event={"ID":"8c4ea958-3175-42d0-9e0a-26225cada08a","Type":"ContainerStarted","Data":"b102947e3208570cc4f4e8a9436f81275bfa1cca510ecb1ed504e1d7414ab31e"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.779145 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sws9j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.779192 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.815292 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" event={"ID":"72a14adc-ebea-42b0-bd10-1a82d58d7c0e","Type":"ContainerStarted","Data":"0dc55d933131c1665cd5214c421dd43d7690df09e475e516eee2b4da99c50a43"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.829292 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.829648 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.329621324 +0000 UTC m=+142.090945735 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.863357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" event={"ID":"8d83a32b-c8fc-438b-9797-664c5a2c5360","Type":"ContainerStarted","Data":"7654a070e70608449a44cd80a68884c9c3b115e7400e5cb14555f67060763998"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.885538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" event={"ID":"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0","Type":"ContainerStarted","Data":"502547b41da86696413c4c3d61b03f6224d06556772985cdca074bb67af51d5d"} Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.886902 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:25 crc kubenswrapper[4764]: 
I1203 23:43:25.905638 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-f2nwm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.905684 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f2nwm" podUID="d64b774a-681b-4c0f-b2eb-36398275e451" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.937543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:25 crc kubenswrapper[4764]: E1203 23:43:25.938798 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.438784634 +0000 UTC m=+142.200109045 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:25 crc kubenswrapper[4764]: I1203 23:43:25.998619 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4js5f" podStartSLOduration=122.998595623 podStartE2EDuration="2m2.998595623s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:25.955343374 +0000 UTC m=+141.716667785" watchObservedRunningTime="2025-12-03 23:43:25.998595623 +0000 UTC m=+141.759920034" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.002232 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-d7hth" podStartSLOduration=6.002223526 podStartE2EDuration="6.002223526s" podCreationTimestamp="2025-12-03 23:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.001240228 +0000 UTC m=+141.762564629" watchObservedRunningTime="2025-12-03 23:43:26.002223526 +0000 UTC m=+141.763547937" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.038572 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.040020 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.539986899 +0000 UTC m=+142.301311370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.056981 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fnct7" podStartSLOduration=123.056963731 podStartE2EDuration="2m3.056963731s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.051882656 +0000 UTC m=+141.813207077" watchObservedRunningTime="2025-12-03 23:43:26.056963731 +0000 UTC m=+141.818288142" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.097283 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.105493 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:26 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:26 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:26 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.105533 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.145916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.146442 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.646431302 +0000 UTC m=+142.407755713 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.246393 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vlqkp" podStartSLOduration=123.24637127 podStartE2EDuration="2m3.24637127s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.170063733 +0000 UTC m=+141.931388144" watchObservedRunningTime="2025-12-03 23:43:26.24637127 +0000 UTC m=+142.007695681" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.247111 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-62qv5" podStartSLOduration=122.247106541 podStartE2EDuration="2m2.247106541s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.224622383 +0000 UTC m=+141.985946794" watchObservedRunningTime="2025-12-03 23:43:26.247106541 +0000 UTC m=+142.008430952" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.247327 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.247685 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.747670867 +0000 UTC m=+142.508995278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.263152 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wxtcz" podStartSLOduration=123.263139777 podStartE2EDuration="2m3.263139777s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.260332997 +0000 UTC m=+142.021657408" watchObservedRunningTime="2025-12-03 23:43:26.263139777 +0000 UTC m=+142.024464188" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.322767 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-fzmrn" podStartSLOduration=122.32274264 podStartE2EDuration="2m2.32274264s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.320425754 +0000 UTC m=+142.081750165" 
watchObservedRunningTime="2025-12-03 23:43:26.32274264 +0000 UTC m=+142.084067051" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.351556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.351879 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.851862337 +0000 UTC m=+142.613186748 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.379427 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" podStartSLOduration=122.379412479 podStartE2EDuration="2m2.379412479s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.377799233 +0000 UTC m=+142.139123644" watchObservedRunningTime="2025-12-03 23:43:26.379412479 +0000 UTC m=+142.140736890" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 
23:43:26.457199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.457492 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:26.957475586 +0000 UTC m=+142.718799997 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.504108 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" podStartSLOduration=122.504092 podStartE2EDuration="2m2.504092s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.456434587 +0000 UTC m=+142.217758998" watchObservedRunningTime="2025-12-03 23:43:26.504092 +0000 UTC m=+142.265416411" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.505143 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" 
podStartSLOduration=123.50513835 podStartE2EDuration="2m3.50513835s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.502876746 +0000 UTC m=+142.264201157" watchObservedRunningTime="2025-12-03 23:43:26.50513835 +0000 UTC m=+142.266462761" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.549768 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tw2qg" podStartSLOduration=122.549754397 podStartE2EDuration="2m2.549754397s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.548468121 +0000 UTC m=+142.309792532" watchObservedRunningTime="2025-12-03 23:43:26.549754397 +0000 UTC m=+142.311078808" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.561341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.561908 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.061897922 +0000 UTC m=+142.823222333 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.665223 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.665665 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.165648979 +0000 UTC m=+142.926973390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.683331 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" podStartSLOduration=122.683312931 podStartE2EDuration="2m2.683312931s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.655991545 +0000 UTC m=+142.417315956" watchObservedRunningTime="2025-12-03 23:43:26.683312931 +0000 UTC m=+142.444637342" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.683560 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-9px68" podStartSLOduration=123.683556908 podStartE2EDuration="2m3.683556908s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.682005284 +0000 UTC m=+142.443329695" watchObservedRunningTime="2025-12-03 23:43:26.683556908 +0000 UTC m=+142.444881319" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.713318 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5lcxc" podStartSLOduration=123.713306823 podStartE2EDuration="2m3.713306823s" 
podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.712662064 +0000 UTC m=+142.473986475" watchObservedRunningTime="2025-12-03 23:43:26.713306823 +0000 UTC m=+142.474631234" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.743514 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-f2nwm" podStartSLOduration=123.74349718 podStartE2EDuration="2m3.74349718s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:26.743297075 +0000 UTC m=+142.504621486" watchObservedRunningTime="2025-12-03 23:43:26.74349718 +0000 UTC m=+142.504821591" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.765306 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.765753 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.766459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.766793 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 23:43:27.266778601 +0000 UTC m=+143.028103012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.786061 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.866926 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.867335 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.367320597 +0000 UTC m=+143.128645008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.951318 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" event={"ID":"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0","Type":"ContainerStarted","Data":"2a353a61db595d3786506b0fd0e0407db1f3baa374069b75863dcc2ad41548e7"} Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.972006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" event={"ID":"e21b87e1-9839-4439-89ac-32f6a196774b","Type":"ContainerStarted","Data":"7a50d1084d9269e001e4450e27d627a2645da4be11a8d07609a552a56dddc577"} Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.973191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:26 crc kubenswrapper[4764]: E1203 23:43:26.973551 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.473534544 +0000 UTC m=+143.234858955 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:26 crc kubenswrapper[4764]: I1203 23:43:26.992764 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" event={"ID":"6042f2bb-d728-462a-a118-e2712e9a214f","Type":"ContainerStarted","Data":"bc46a37d5a27acf53ac4173aee1398e6974667c9b57c720b9f2fbad6c6095bad"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.020187 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" event={"ID":"0b194a71-9d06-4186-8811-8668eaae38b0","Type":"ContainerStarted","Data":"df7d8e5351cdf7e3fd90dd170de5dc0f7a5a85f7a42ea390669a08a53bb09983"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.020240 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" event={"ID":"0b194a71-9d06-4186-8811-8668eaae38b0","Type":"ContainerStarted","Data":"ed90de37d9f5763c730241fd1a72805a790921a02f07f7a931572d6bf4f01ef6"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.020561 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.074015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" 
event={"ID":"0af14017-eba4-401e-9f16-d4a7fd22b6b8","Type":"ContainerStarted","Data":"32535e49f91c3db99ee81fe70693f33f942bd15568ecfcb7631b2afd54921672"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.074492 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.075939 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.575900231 +0000 UTC m=+143.337224642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.095039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" event={"ID":"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb","Type":"ContainerStarted","Data":"a88fe7bde0406444b8f9c1e9b51de0ad2cbe72233c08d56f02bf21ab7ca36b50"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.095083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" 
event={"ID":"eeddcfa4-47f1-4eb6-a26f-4e7fe0d3c7bb","Type":"ContainerStarted","Data":"f5f9a34431df93e3927942951072c04a3d90953759c821621451f8163f1d2aab"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.102015 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:27 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:27 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:27 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.102102 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.114390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" event={"ID":"220eeac0-5b43-462d-89cc-5182a6b1f686","Type":"ContainerStarted","Data":"83dee69a305d81012986a61e20e4a0e4baaf0715ff083fe7493c972bf8c9c231"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.114448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" event={"ID":"220eeac0-5b43-462d-89cc-5182a6b1f686","Type":"ContainerStarted","Data":"f109d6458e1242fa9a7d2bc51060686bc700a5612d3f901afde32aa57e78eae8"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.139752 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" podStartSLOduration=123.139706204 podStartE2EDuration="2m3.139706204s" podCreationTimestamp="2025-12-03 23:41:24 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.08394376 +0000 UTC m=+142.845268171" watchObservedRunningTime="2025-12-03 23:43:27.139706204 +0000 UTC m=+142.901030615" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.142065 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ln9l" podStartSLOduration=124.14205706 podStartE2EDuration="2m4.14205706s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.140849686 +0000 UTC m=+142.902174107" watchObservedRunningTime="2025-12-03 23:43:27.14205706 +0000 UTC m=+142.903381471" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.170018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" event={"ID":"fb55c6e2-2206-4735-ba1a-bfbff1e7549a","Type":"ContainerStarted","Data":"e043c9d2b3374daf8f086aef96ea040404c55b1325de4ec3c5c21b70b1857896"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.172282 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-d7hth" event={"ID":"ad151012-706a-4e5d-9565-70f905dd6a87","Type":"ContainerStarted","Data":"fa0b13d8f1b1d230c752747838f8ec187c3381dd1ff03c45e5eb23f18de25a3e"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.177832 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.179613 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.679599537 +0000 UTC m=+143.440923948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.205413 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" event={"ID":"28c57be9-2500-4944-abed-6fe2e4e2dd0d","Type":"ContainerStarted","Data":"3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.206173 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.215146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" event={"ID":"f919d5d0-c940-484d-9017-ada61860924a","Type":"ContainerStarted","Data":"40328480919d04a1ffa2048e74bce522cbd5b1ef681af706ce6ae0e1eee6a64c"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.215270 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" 
event={"ID":"f919d5d0-c940-484d-9017-ada61860924a","Type":"ContainerStarted","Data":"109c4537c67f77d649e7867e995562ac770d2442e2996b01487108e8cc6376db"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.225909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" event={"ID":"22c49dcd-28aa-42f9-92fd-7e033cba7846","Type":"ContainerStarted","Data":"9d7781d4dfffef1bd0fca0001c97b1c71a7e96bf32e24a8cdcd21488a00acb26"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.255777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" event={"ID":"163c97f5-601b-47dd-8653-5dffdf157659","Type":"ContainerStarted","Data":"e52f47bd180e1fe33b03b7fc45a613d4536ed684bd6af085d69038cd2cc2a6a9"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.256013 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" event={"ID":"163c97f5-601b-47dd-8653-5dffdf157659","Type":"ContainerStarted","Data":"724104cc1ae6ffc5f37aadad2d3012b6954540ced08563f15d1fa1b738d587fb"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.255889 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-vhrb8" podStartSLOduration=124.255868623 podStartE2EDuration="2m4.255868623s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.189054605 +0000 UTC m=+142.950379016" watchObservedRunningTime="2025-12-03 23:43:27.255868623 +0000 UTC m=+143.017193034" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.256501 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" 
podStartSLOduration=124.256491981 podStartE2EDuration="2m4.256491981s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.243739939 +0000 UTC m=+143.005064350" watchObservedRunningTime="2025-12-03 23:43:27.256491981 +0000 UTC m=+143.017816392" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.274103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" event={"ID":"b9c9e052-a750-4b4f-873c-9808c0a3c75b","Type":"ContainerStarted","Data":"2603f706277452e1349781dc54613bef7b71d0b18b14cb79b1935e52b91eee32"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.278374 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.278480 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" podStartSLOduration=124.278469835 podStartE2EDuration="2m4.278469835s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.278000772 +0000 UTC m=+143.039325183" watchObservedRunningTime="2025-12-03 23:43:27.278469835 +0000 UTC m=+143.039794246" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.279583 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.779563776 +0000 UTC m=+143.540888177 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.298735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" event={"ID":"cacf2ed9-cf5d-4876-93d2-690ba449e153","Type":"ContainerStarted","Data":"17d40a9450eee1d1bedb9564ca6902b9f3b60490cd6afcdfa3e6c408e583f148"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.299304 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.305451 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qr6vp" podStartSLOduration=124.305438581 podStartE2EDuration="2m4.305438581s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.304050932 +0000 UTC m=+143.065375343" watchObservedRunningTime="2025-12-03 23:43:27.305438581 +0000 UTC m=+143.066762992" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.311986 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lqqs" 
event={"ID":"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82","Type":"ContainerStarted","Data":"b4b18007338ef8225ed2525fa33433f5aa7afe48d3574815c1bc51ea2ddaca63"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.312023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lqqs" event={"ID":"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82","Type":"ContainerStarted","Data":"ff770f12b8b9655b352579080b6ce6beb8b8ae8cc0d3d9851d6945f3c1cc904f"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.347924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" event={"ID":"75845c6a-1b60-4846-a1ef-7e719c5ce398","Type":"ContainerStarted","Data":"2d1aec009823a9c38ff73d742c4430e19a18330004b0b6ab46f9bd1170f8cea4"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.382585 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.385562 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.885550466 +0000 UTC m=+143.646874867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.389535 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tw5p6" event={"ID":"d1a404e9-c110-4033-9ef0-52c3d6a28d87","Type":"ContainerStarted","Data":"5d07316fb5e29d17bfbf320e733760f08514ddf7516417bb812667b0423ddd65"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.414120 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-pc59h" podStartSLOduration=124.414099127 podStartE2EDuration="2m4.414099127s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.378131636 +0000 UTC m=+143.139456047" watchObservedRunningTime="2025-12-03 23:43:27.414099127 +0000 UTC m=+143.175423548" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.416207 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4dlcs"] Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.417111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.436544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" event={"ID":"c34ace5f-42af-4058-9792-acfc9340252e","Type":"ContainerStarted","Data":"6cc2c7df72628fdc77b1ae91978709eead1d390b433b07231a9f24f296d6a00d"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.458065 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j6rcl" podStartSLOduration=124.458045035 podStartE2EDuration="2m4.458045035s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.436923636 +0000 UTC m=+143.198248047" watchObservedRunningTime="2025-12-03 23:43:27.458045035 +0000 UTC m=+143.219369436" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.465938 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.473235 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" event={"ID":"8c4ea958-3175-42d0-9e0a-26225cada08a","Type":"ContainerStarted","Data":"2ea56d665de9d34778a0fb5859c572d0f038f5f7d8fd65b33bfec63f5e638629"} Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.484029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:27 crc 
kubenswrapper[4764]: I1203 23:43:27.484371 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-catalog-content\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.484450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-utilities\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.484541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42drr\" (UniqueName: \"kubernetes.io/projected/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-kube-api-access-42drr\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.485060 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:27.985042652 +0000 UTC m=+143.746367063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.497343 4764 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sws9j container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.497394 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.497750 4764 patch_prober.go:28] interesting pod/downloads-7954f5f757-f2nwm container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.497775 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-f2nwm" podUID="d64b774a-681b-4c0f-b2eb-36398275e451" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.497789 4764 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4dlcs"] Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.502773 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.503190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-jr756" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.512947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qb6kg" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.561566 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" podStartSLOduration=123.561550385 podStartE2EDuration="2m3.561550385s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.560683721 +0000 UTC m=+143.322008132" watchObservedRunningTime="2025-12-03 23:43:27.561550385 +0000 UTC m=+143.322874796" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.564903 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-phcd4" podStartSLOduration=124.56489286 podStartE2EDuration="2m4.56489286s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.503063104 +0000 UTC m=+143.264387505" watchObservedRunningTime="2025-12-03 23:43:27.56489286 +0000 UTC m=+143.326217271" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.585876 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-utilities\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.586142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42drr\" (UniqueName: \"kubernetes.io/projected/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-kube-api-access-42drr\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.586286 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-catalog-content\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.586308 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.590186 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vn9sk"] Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.591051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-utilities\") pod 
\"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.594706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-catalog-content\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.594966 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.094951754 +0000 UTC m=+143.856276165 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.619417 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-mh8l6" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.619517 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.634444 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.636724 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tw5p6" podStartSLOduration=7.636694179 podStartE2EDuration="7.636694179s" podCreationTimestamp="2025-12-03 23:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.634742783 +0000 UTC m=+143.396067194" watchObservedRunningTime="2025-12-03 23:43:27.636694179 +0000 UTC m=+143.398018590" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.645901 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42drr\" (UniqueName: \"kubernetes.io/projected/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-kube-api-access-42drr\") pod \"certified-operators-4dlcs\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.646052 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn9sk"] Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.688417 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.688621 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-catalog-content\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.688660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-utilities\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.688708 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.188684495 +0000 UTC m=+143.950008916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.688753 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65b4m\" (UniqueName: \"kubernetes.io/projected/84e0b7ae-01df-4863-b257-afb9a27507cd-kube-api-access-65b4m\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.723053 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7tgwq" podStartSLOduration=124.723038211 podStartE2EDuration="2m4.723038211s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.721031664 +0000 UTC m=+143.482356075" watchObservedRunningTime="2025-12-03 23:43:27.723038211 +0000 UTC m=+143.484362622" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.750664 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qxbf2" podStartSLOduration=124.750643755 podStartE2EDuration="2m4.750643755s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.748123233 +0000 UTC 
m=+143.509447644" watchObservedRunningTime="2025-12-03 23:43:27.750643755 +0000 UTC m=+143.511968166" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.792998 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.793357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-catalog-content\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.793396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.793434 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-utilities\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.793477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65b4m\" (UniqueName: \"kubernetes.io/projected/84e0b7ae-01df-4863-b257-afb9a27507cd-kube-api-access-65b4m\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 
23:43:27.794082 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.294069239 +0000 UTC m=+144.055393650 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.794100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-catalog-content\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.794300 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-utilities\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.797982 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v245m" podStartSLOduration=124.797970489 podStartE2EDuration="2m4.797970489s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 23:43:27.774992147 +0000 UTC m=+143.536316558" watchObservedRunningTime="2025-12-03 23:43:27.797970489 +0000 UTC m=+143.559294900" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.799657 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zwfxp"] Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.800509 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.829530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65b4m\" (UniqueName: \"kubernetes.io/projected/84e0b7ae-01df-4863-b257-afb9a27507cd-kube-api-access-65b4m\") pod \"community-operators-vn9sk\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.846125 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zwfxp"] Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.897274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.897403 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-utilities\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.897481 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvn7\" (UniqueName: \"kubernetes.io/projected/d1fde235-cf99-4964-ba2e-df36c34906b5-kube-api-access-tnvn7\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.897524 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-catalog-content\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: E1203 23:43:27.897620 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.397606509 +0000 UTC m=+144.158930920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.968970 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.970215 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8wjsw" podStartSLOduration=124.970196451 podStartE2EDuration="2m4.970196451s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:27.964055777 +0000 UTC m=+143.725380188" watchObservedRunningTime="2025-12-03 23:43:27.970196451 +0000 UTC m=+143.731520862" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.998482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-utilities\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.998571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvn7\" (UniqueName: \"kubernetes.io/projected/d1fde235-cf99-4964-ba2e-df36c34906b5-kube-api-access-tnvn7\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.998617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.998637 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-catalog-content\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.999030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-catalog-content\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:27 crc kubenswrapper[4764]: I1203 23:43:27.999409 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-utilities\") pod \"certified-operators-zwfxp\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:27.999843 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.499832683 +0000 UTC m=+144.261157094 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.014284 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gbjlz"] Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.015165 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.020487 4764 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5bbgm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.020564 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" podUID="0b194a71-9d06-4186-8811-8668eaae38b0" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.040436 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvn7\" (UniqueName: \"kubernetes.io/projected/d1fde235-cf99-4964-ba2e-df36c34906b5-kube-api-access-tnvn7\") pod \"certified-operators-zwfxp\" (UID: 
\"d1fde235-cf99-4964-ba2e-df36c34906b5\") " pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.064746 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbjlz"] Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.100836 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.101337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-catalog-content\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.101416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-utilities\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.101448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwlzd\" (UniqueName: \"kubernetes.io/projected/e5bc496d-9c32-491f-95da-41cf3850be09-kube-api-access-kwlzd\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.101549 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.601535011 +0000 UTC m=+144.362859422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.105263 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:28 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:28 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:28 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.105312 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.184116 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.206910 4764 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-cgmmp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.206980 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.207378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwlzd\" (UniqueName: \"kubernetes.io/projected/e5bc496d-9c32-491f-95da-41cf3850be09-kube-api-access-kwlzd\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.207445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-catalog-content\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.207497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.207526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-utilities\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.208615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-utilities\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.208736 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-catalog-content\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.230771 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.730729931 +0000 UTC m=+144.492054342 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.248674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwlzd\" (UniqueName: \"kubernetes.io/projected/e5bc496d-9c32-491f-95da-41cf3850be09-kube-api-access-kwlzd\") pod \"community-operators-gbjlz\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.311082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.311565 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.811549986 +0000 UTC m=+144.572874397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.337263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4dlcs"] Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.370992 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.414534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.414855 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:28.91484408 +0000 UTC m=+144.676168481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.510541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" event={"ID":"e21b87e1-9839-4439-89ac-32f6a196774b","Type":"ContainerStarted","Data":"c159cfb63c8606ab638b637fcb8a958f0b37f66887d20961dc0eb18cc2db9bed"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.516178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.516873 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.016829067 +0000 UTC m=+144.778153478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.518906 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.018893596 +0000 UTC m=+144.780218007 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.517087 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.533851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" 
event={"ID":"6042f2bb-d728-462a-a118-e2712e9a214f","Type":"ContainerStarted","Data":"cc759fe3245de0e774590255dd26eb89176499ef1f684c9ddb71b61a65b96d64"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.534813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-66kxp" podStartSLOduration=124.534796387 podStartE2EDuration="2m4.534796387s" podCreationTimestamp="2025-12-03 23:41:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:28.534302873 +0000 UTC m=+144.295627284" watchObservedRunningTime="2025-12-03 23:43:28.534796387 +0000 UTC m=+144.296120798" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.589907 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vn9sk"] Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.589962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" event={"ID":"e8324e0f-a676-4015-bbcb-cf68235eb72a","Type":"ContainerStarted","Data":"63e65d5421b330a7fe18ebeddb492d9d7bb7829b15ffc4dc6e69138f6a3cec8e"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.589995 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" event={"ID":"e8324e0f-a676-4015-bbcb-cf68235eb72a","Type":"ContainerStarted","Data":"c45acbfdd313d5851999d31482907b46fec28ce03347985a1304d3c71d942ad3"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.608982 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" podStartSLOduration=125.608965874 podStartE2EDuration="2m5.608965874s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 23:43:28.607836442 +0000 UTC m=+144.369160853" watchObservedRunningTime="2025-12-03 23:43:28.608965874 +0000 UTC m=+144.370290285" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.621449 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.622256 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.122242021 +0000 UTC m=+144.883566432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.635260 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2lqqs" event={"ID":"f30cc0ba-74c2-44f7-adc3-9f80cb4dac82","Type":"ContainerStarted","Data":"23a2580be687e551e8621c5202e689f37e4bd8238bc6316cdd9b75be2fe588b0"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.635563 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.641944 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-4dlcs" event={"ID":"3ef5d8b9-01e5-4668-aa91-0406a52a40ca","Type":"ContainerStarted","Data":"fda7394c01e0633a382eb2af3af341e0a35cd5e611241654f852ddc3cdcc7b5d"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.652580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" event={"ID":"4d0ddfe9-eff3-4b36-b863-5e0d12a3fcf0","Type":"ContainerStarted","Data":"26d3f92f46d3b3e99fe6c2af4f42d1795de7cbf70205c65cc6659cd96c8fbade"} Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.662145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.663026 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5bbgm" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.663072 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.677020 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2lqqs" podStartSLOduration=8.676996956 podStartE2EDuration="8.676996956s" podCreationTimestamp="2025-12-03 23:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:28.667249729 +0000 UTC m=+144.428574140" watchObservedRunningTime="2025-12-03 23:43:28.676996956 +0000 UTC m=+144.438321367" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.716797 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-glq4z" podStartSLOduration=125.716782786 podStartE2EDuration="2m5.716782786s" 
podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:28.716069096 +0000 UTC m=+144.477393507" watchObservedRunningTime="2025-12-03 23:43:28.716782786 +0000 UTC m=+144.478107187" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.727229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.733050 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.233037408 +0000 UTC m=+144.994361819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.806936 4764 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.830822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.831528 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.331501365 +0000 UTC m=+145.092825776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.889831 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zwfxp"] Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.919785 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gbjlz"] Dec 03 23:43:28 crc kubenswrapper[4764]: I1203 23:43:28.936489 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:28 crc kubenswrapper[4764]: E1203 23:43:28.936810 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.436797175 +0000 UTC m=+145.198121586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:28 crc kubenswrapper[4764]: W1203 23:43:28.968526 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5bc496d_9c32_491f_95da_41cf3850be09.slice/crio-75eb5e35ba2bc25370ce73f0817a814d9305992cc109495ec2c910b9226a76b9 WatchSource:0}: Error finding container 75eb5e35ba2bc25370ce73f0817a814d9305992cc109495ec2c910b9226a76b9: Status 404 returned error can't find the container with id 75eb5e35ba2bc25370ce73f0817a814d9305992cc109495ec2c910b9226a76b9 Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.037098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.037291 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.537260109 +0000 UTC m=+145.298584520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.037486 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.037816 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.537800154 +0000 UTC m=+145.299124575 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.091943 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:29 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:29 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:29 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.092015 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.138490 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.138666 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 23:43:29.638639988 +0000 UTC m=+145.399964399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.138826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.139179 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.639165933 +0000 UTC m=+145.400490344 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.240014 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.240797 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.740777779 +0000 UTC m=+145.502102190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.342275 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.342794 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.842708894 +0000 UTC m=+145.604033305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.443471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.443812 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:29.943792955 +0000 UTC m=+145.705117366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.544690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.545062 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:30.045050391 +0000 UTC m=+145.806374792 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.589774 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wdjgl"] Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.591045 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.598145 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.603588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdjgl"] Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.646256 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.646396 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:30.146369709 +0000 UTC m=+145.907694120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.646448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.646489 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-utilities\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.646542 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqm79\" (UniqueName: \"kubernetes.io/projected/74e0af5e-bc95-4918-9c09-524e159e1eba-kube-api-access-nqm79\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.646577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-catalog-content\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.646802 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 23:43:30.146792821 +0000 UTC m=+145.908117232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-k47db" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.657490 4764 generic.go:334] "Generic (PLEG): container finished" podID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerID="c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740" exitCode=0 Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.657607 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4dlcs" event={"ID":"3ef5d8b9-01e5-4668-aa91-0406a52a40ca","Type":"ContainerDied","Data":"c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.659226 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.659514 4764 generic.go:334] "Generic (PLEG): container finished" podID="84e0b7ae-01df-4863-b257-afb9a27507cd" 
containerID="00aef84fa38aa27da61060ffdc0479899bbe578d526ad659144bf016f07c2697" exitCode=0 Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.659581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn9sk" event={"ID":"84e0b7ae-01df-4863-b257-afb9a27507cd","Type":"ContainerDied","Data":"00aef84fa38aa27da61060ffdc0479899bbe578d526ad659144bf016f07c2697"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.659764 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn9sk" event={"ID":"84e0b7ae-01df-4863-b257-afb9a27507cd","Type":"ContainerStarted","Data":"53cb7fa0d6d7ec42dbc894759835e28618e7fd0172d09c38c1929326316605e7"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.663354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" event={"ID":"6042f2bb-d728-462a-a118-e2712e9a214f","Type":"ContainerStarted","Data":"2cfdd63a9cdb68b82d4e6e80ff725e6b864b663f2a107e73eb1242b9b8523f5a"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.663394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" event={"ID":"6042f2bb-d728-462a-a118-e2712e9a214f","Type":"ContainerStarted","Data":"affde3399078f0f90242c64938862b461aff0c6847895ba4edfccecb5f978035"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.663408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" event={"ID":"6042f2bb-d728-462a-a118-e2712e9a214f","Type":"ContainerStarted","Data":"a2190eee5e591e557655e7525a889c427f2333e286339572ac17b552db84c28a"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.669475 4764 generic.go:334] "Generic (PLEG): container finished" podID="e5bc496d-9c32-491f-95da-41cf3850be09" containerID="4377b022c80187209305eeb20d8aadbd1cd96f857a3b52d5f041f2f6eacd9a51" exitCode=0 Dec 03 23:43:29 crc kubenswrapper[4764]: 
I1203 23:43:29.669568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbjlz" event={"ID":"e5bc496d-9c32-491f-95da-41cf3850be09","Type":"ContainerDied","Data":"4377b022c80187209305eeb20d8aadbd1cd96f857a3b52d5f041f2f6eacd9a51"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.669601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbjlz" event={"ID":"e5bc496d-9c32-491f-95da-41cf3850be09","Type":"ContainerStarted","Data":"75eb5e35ba2bc25370ce73f0817a814d9305992cc109495ec2c910b9226a76b9"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.671510 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerID="d28d8ac0ba390876af5127f46e71ac9e07451d5c46e79c1311c749c86ebe3ddf" exitCode=0 Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.671687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwfxp" event={"ID":"d1fde235-cf99-4964-ba2e-df36c34906b5","Type":"ContainerDied","Data":"d28d8ac0ba390876af5127f46e71ac9e07451d5c46e79c1311c749c86ebe3ddf"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.671746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwfxp" event={"ID":"d1fde235-cf99-4964-ba2e-df36c34906b5","Type":"ContainerStarted","Data":"53e63d0fa266ef727c79fe5522f9d3c49c436d6cb91b416bc8a1b64b56b197f3"} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.724509 4764 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T23:43:28.807165813Z","Handler":null,"Name":""} Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.748201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.748579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-utilities\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.748823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqm79\" (UniqueName: \"kubernetes.io/projected/74e0af5e-bc95-4918-9c09-524e159e1eba-kube-api-access-nqm79\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.748909 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-catalog-content\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: E1203 23:43:29.749909 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 23:43:30.249886589 +0000 UTC m=+146.011211020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.750440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-catalog-content\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.752924 4764 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.752968 4764 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.753040 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-utilities\") pod \"redhat-marketplace-wdjgl\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.788006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqm79\" (UniqueName: \"kubernetes.io/projected/74e0af5e-bc95-4918-9c09-524e159e1eba-kube-api-access-nqm79\") pod \"redhat-marketplace-wdjgl\" 
(UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.829202 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-sm25h" podStartSLOduration=9.829179262 podStartE2EDuration="9.829179262s" podCreationTimestamp="2025-12-03 23:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:29.826811334 +0000 UTC m=+145.588135745" watchObservedRunningTime="2025-12-03 23:43:29.829179262 +0000 UTC m=+145.590503673" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.850511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.857314 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.857367 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.910184 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.917588 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-k47db\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") " pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.951303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.962110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" 
(UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.994257 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d7jhj"] Dec 03 23:43:29 crc kubenswrapper[4764]: I1203 23:43:29.995589 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.013327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7jhj"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.052434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-catalog-content\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.052531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jctsc\" (UniqueName: \"kubernetes.io/projected/54783f7d-0533-48d7-b80a-cc9e7941ebaf-kube-api-access-jctsc\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.052589 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-utilities\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.106884 
4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:30 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:30 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:30 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.107197 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.107064 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.153343 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-catalog-content\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.153404 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jctsc\" (UniqueName: \"kubernetes.io/projected/54783f7d-0533-48d7-b80a-cc9e7941ebaf-kube-api-access-jctsc\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.153441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-utilities\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.154018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-utilities\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.154224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-catalog-content\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.182012 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jctsc\" (UniqueName: \"kubernetes.io/projected/54783f7d-0533-48d7-b80a-cc9e7941ebaf-kube-api-access-jctsc\") pod \"redhat-marketplace-d7jhj\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.256439 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdjgl"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.312962 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.396173 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k47db"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.561869 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.614687 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zz8jv"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.616357 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.620777 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.635693 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zz8jv"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.687522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" event={"ID":"c32eecc0-7e82-4d0b-bdbf-36fe53c01065","Type":"ContainerStarted","Data":"2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c"} Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.687559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" event={"ID":"c32eecc0-7e82-4d0b-bdbf-36fe53c01065","Type":"ContainerStarted","Data":"b5c058fa6fba1f321db5a31630c4859c2b116d77056742dbeaabf37ae7c27509"} Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.688430 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.696353 4764 generic.go:334] "Generic (PLEG): container finished" podID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerID="c77752d707616d56cc26d3350cc2ecb3c2c94474602bea4019408fd1e67bc1ff" exitCode=0 Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.697687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdjgl" event={"ID":"74e0af5e-bc95-4918-9c09-524e159e1eba","Type":"ContainerDied","Data":"c77752d707616d56cc26d3350cc2ecb3c2c94474602bea4019408fd1e67bc1ff"} Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.697733 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdjgl" event={"ID":"74e0af5e-bc95-4918-9c09-524e159e1eba","Type":"ContainerStarted","Data":"3499adadd68626c5c313b03d2b97e181cf9a9ac71f447f9ac143f58fc523ff27"} Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.700474 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-catalog-content\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.700538 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-utilities\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.700634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v26s2\" (UniqueName: 
\"kubernetes.io/projected/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-kube-api-access-v26s2\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.743112 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" podStartSLOduration=127.743094489 podStartE2EDuration="2m7.743094489s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:30.719993553 +0000 UTC m=+146.481317964" watchObservedRunningTime="2025-12-03 23:43:30.743094489 +0000 UTC m=+146.504418900" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.764218 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.764950 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.767005 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.767180 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.770236 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.791084 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z4gdg"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.793341 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.794758 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4gdg"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.803333 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-catalog-content\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.803411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-utilities\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 
23:43:30.803582 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v26s2\" (UniqueName: \"kubernetes.io/projected/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-kube-api-access-v26s2\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.804252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-catalog-content\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.805800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-utilities\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.825794 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v26s2\" (UniqueName: \"kubernetes.io/projected/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-kube-api-access-v26s2\") pod \"redhat-operators-zz8jv\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.880241 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7jhj"] Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.905002 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-catalog-content\") pod \"redhat-operators-z4gdg\" (UID: 
\"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.905060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c2be04c-6e4b-43f3-b55c-1966078088a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.905106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwn5r\" (UniqueName: \"kubernetes.io/projected/a8a2e712-9c08-4688-905c-4aa0af0a2dab-kube-api-access-pwn5r\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.905137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c2be04c-6e4b-43f3-b55c-1966078088a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.905183 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-utilities\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.915969 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:43:30 crc kubenswrapper[4764]: I1203 23:43:30.938125 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.009007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c2be04c-6e4b-43f3-b55c-1966078088a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.009100 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwn5r\" (UniqueName: \"kubernetes.io/projected/a8a2e712-9c08-4688-905c-4aa0af0a2dab-kube-api-access-pwn5r\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.009144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c2be04c-6e4b-43f3-b55c-1966078088a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.009240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-utilities\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.009337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-catalog-content\") pod \"redhat-operators-z4gdg\" (UID: 
\"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.009888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-catalog-content\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.010045 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c2be04c-6e4b-43f3-b55c-1966078088a2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.010176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-utilities\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.033236 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c2be04c-6e4b-43f3-b55c-1966078088a2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.034494 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwn5r\" (UniqueName: \"kubernetes.io/projected/a8a2e712-9c08-4688-905c-4aa0af0a2dab-kube-api-access-pwn5r\") pod \"redhat-operators-z4gdg\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " 
pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.086403 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.093365 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:31 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:31 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:31 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.093502 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.114981 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.322420 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zz8jv"] Dec 03 23:43:31 crc kubenswrapper[4764]: W1203 23:43:31.378304 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d09c74d_dc25_4769_9580_88f3ea4fcf8e.slice/crio-d2d89ac0028e6b538f77e53570e757f07220f5d22677fa98597c8298ef26b95b WatchSource:0}: Error finding container d2d89ac0028e6b538f77e53570e757f07220f5d22677fa98597c8298ef26b95b: Status 404 returned error can't find the container with id d2d89ac0028e6b538f77e53570e757f07220f5d22677fa98597c8298ef26b95b Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.417104 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.417159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.417831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.443375 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.464876 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.518404 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.518519 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.522177 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.532175 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.574597 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 23:43:31 crc kubenswrapper[4764]: W1203 23:43:31.598389 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9c2be04c_6e4b_43f3_b55c_1966078088a2.slice/crio-6724c7f861003d1ef61b46d79d37f10292ce5d8b626262e2454ff89c6ed0ff34 WatchSource:0}: Error finding container 6724c7f861003d1ef61b46d79d37f10292ce5d8b626262e2454ff89c6ed0ff34: Status 404 returned error can't find the container with id 6724c7f861003d1ef61b46d79d37f10292ce5d8b626262e2454ff89c6ed0ff34 Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.613486 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4gdg"] Dec 03 23:43:31 crc kubenswrapper[4764]: W1203 23:43:31.690958 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a2e712_9c08_4688_905c_4aa0af0a2dab.slice/crio-90d129e59472fc244b710da977c19f63750cb7e1cf40481720359c41815e8770 WatchSource:0}: Error finding container 90d129e59472fc244b710da977c19f63750cb7e1cf40481720359c41815e8770: Status 404 returned error can't find the container with id 90d129e59472fc244b710da977c19f63750cb7e1cf40481720359c41815e8770 Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.777222 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.786503 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.811299 4764 generic.go:334] "Generic (PLEG): container finished" podID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerID="069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775" exitCode=0 Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.811609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7jhj" event={"ID":"54783f7d-0533-48d7-b80a-cc9e7941ebaf","Type":"ContainerDied","Data":"069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775"} Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.811662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7jhj" event={"ID":"54783f7d-0533-48d7-b80a-cc9e7941ebaf","Type":"ContainerStarted","Data":"9405ece50c5fd33927258e287d115bf77df0fb4a1cd02afaac8716a4d9482c10"} Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.814180 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c2be04c-6e4b-43f3-b55c-1966078088a2","Type":"ContainerStarted","Data":"6724c7f861003d1ef61b46d79d37f10292ce5d8b626262e2454ff89c6ed0ff34"} Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.817537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerStarted","Data":"f737f2b3826e4e3014d4c9b02de2bbbe502f1b5ce922e18f72f125acd4b60af9"} Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.817564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerStarted","Data":"d2d89ac0028e6b538f77e53570e757f07220f5d22677fa98597c8298ef26b95b"} Dec 03 23:43:31 crc kubenswrapper[4764]: W1203 23:43:31.857145 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-c1e06ac5e11114a71a5f957a979d9e18c2d832e2174bf5c9e1173d67334f98b2 WatchSource:0}: Error finding container c1e06ac5e11114a71a5f957a979d9e18c2d832e2174bf5c9e1173d67334f98b2: Status 404 returned error can't find the container with id c1e06ac5e11114a71a5f957a979d9e18c2d832e2174bf5c9e1173d67334f98b2 Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.977202 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.977519 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.989966 4764 patch_prober.go:28] interesting pod/console-f9d7485db-l7xdw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 03 23:43:31 crc kubenswrapper[4764]: I1203 23:43:31.990031 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-l7xdw" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.099468 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:32 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:32 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:32 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.099517 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.426232 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-f2nwm" Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.475086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.475140 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.481429 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:32 crc kubenswrapper[4764]: W1203 23:43:32.503816 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-4b680b598b9ac3883840477fb65a0017bb1978778130c8d04f5c50842faeb5ff WatchSource:0}: Error finding container 4b680b598b9ac3883840477fb65a0017bb1978778130c8d04f5c50842faeb5ff: Status 404 returned error can't find the container with id 4b680b598b9ac3883840477fb65a0017bb1978778130c8d04f5c50842faeb5ff Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.831479 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bf0e64f27119637dc9deadb3d666da9f0a95f4a2b183d9888490da9f3484ddca"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.831844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c1e06ac5e11114a71a5f957a979d9e18c2d832e2174bf5c9e1173d67334f98b2"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.841874 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e8322e88d687cc11fdb9d0238f3380aff3904c8c49ea869dd3e0b986d559f8a2"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.841937 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4b680b598b9ac3883840477fb65a0017bb1978778130c8d04f5c50842faeb5ff"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.847667 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerID="b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f" exitCode=0 Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.848160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerDied","Data":"b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.848229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerStarted","Data":"90d129e59472fc244b710da977c19f63750cb7e1cf40481720359c41815e8770"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.856628 4764 generic.go:334] "Generic (PLEG): container finished" podID="9c2be04c-6e4b-43f3-b55c-1966078088a2" containerID="e36fc7b57993b2115dc5e3108ea9ffb99899cbeb5b472b958cc756cbe1d1d828" exitCode=0 Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.856797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c2be04c-6e4b-43f3-b55c-1966078088a2","Type":"ContainerDied","Data":"e36fc7b57993b2115dc5e3108ea9ffb99899cbeb5b472b958cc756cbe1d1d828"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.868054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dec9ad31d9be01ad33abe8f9f51cd01829d5d5f9248c124d77bacb3436d9a0a5"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.868095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"49a9d4b4270673eed7c01eec5f7fa39d895655698786712acedaaa834f2ee4c3"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.871051 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.899440 4764 generic.go:334] "Generic (PLEG): container finished" podID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerID="f737f2b3826e4e3014d4c9b02de2bbbe502f1b5ce922e18f72f125acd4b60af9" exitCode=0 Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.900412 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerDied","Data":"f737f2b3826e4e3014d4c9b02de2bbbe502f1b5ce922e18f72f125acd4b60af9"} Dec 03 23:43:32 crc kubenswrapper[4764]: I1203 23:43:32.906271 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tl5lg" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.090854 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.097075 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:33 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:33 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:33 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.097110 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.214848 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.215483 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.219350 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.223563 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.224066 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.262028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a292c37-c957-4c88-9ea7-486e96915143-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.262126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a292c37-c957-4c88-9ea7-486e96915143-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.363504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a292c37-c957-4c88-9ea7-486e96915143-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.363581 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1a292c37-c957-4c88-9ea7-486e96915143-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.363683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a292c37-c957-4c88-9ea7-486e96915143-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.398976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a292c37-c957-4c88-9ea7-486e96915143-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.545342 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.783285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 23:43:33 crc kubenswrapper[4764]: W1203 23:43:33.816830 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1a292c37_c957_4c88_9ea7_486e96915143.slice/crio-520674f98347fd252b3d03df303e8346cf1940ff34dec9c4d767b5049cd25329 WatchSource:0}: Error finding container 520674f98347fd252b3d03df303e8346cf1940ff34dec9c4d767b5049cd25329: Status 404 returned error can't find the container with id 520674f98347fd252b3d03df303e8346cf1940ff34dec9c4d767b5049cd25329 Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.916999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1a292c37-c957-4c88-9ea7-486e96915143","Type":"ContainerStarted","Data":"520674f98347fd252b3d03df303e8346cf1940ff34dec9c4d767b5049cd25329"} Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.920034 4764 generic.go:334] "Generic (PLEG): container finished" podID="220eeac0-5b43-462d-89cc-5182a6b1f686" containerID="83dee69a305d81012986a61e20e4a0e4baaf0715ff083fe7493c972bf8c9c231" exitCode=0 Dec 03 23:43:33 crc kubenswrapper[4764]: I1203 23:43:33.920071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" event={"ID":"220eeac0-5b43-462d-89cc-5182a6b1f686","Type":"ContainerDied","Data":"83dee69a305d81012986a61e20e4a0e4baaf0715ff083fe7493c972bf8c9c231"} Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.096909 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 
23:43:34 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:34 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:34 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.097229 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.243304 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.290798 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c2be04c-6e4b-43f3-b55c-1966078088a2-kubelet-dir\") pod \"9c2be04c-6e4b-43f3-b55c-1966078088a2\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.290845 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c2be04c-6e4b-43f3-b55c-1966078088a2-kube-api-access\") pod \"9c2be04c-6e4b-43f3-b55c-1966078088a2\" (UID: \"9c2be04c-6e4b-43f3-b55c-1966078088a2\") " Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.290940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c2be04c-6e4b-43f3-b55c-1966078088a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c2be04c-6e4b-43f3-b55c-1966078088a2" (UID: "9c2be04c-6e4b-43f3-b55c-1966078088a2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.291201 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c2be04c-6e4b-43f3-b55c-1966078088a2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.296075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c2be04c-6e4b-43f3-b55c-1966078088a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c2be04c-6e4b-43f3-b55c-1966078088a2" (UID: "9c2be04c-6e4b-43f3-b55c-1966078088a2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.391958 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c2be04c-6e4b-43f3-b55c-1966078088a2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.946308 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.946371 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"9c2be04c-6e4b-43f3-b55c-1966078088a2","Type":"ContainerDied","Data":"6724c7f861003d1ef61b46d79d37f10292ce5d8b626262e2454ff89c6ed0ff34"} Dec 03 23:43:34 crc kubenswrapper[4764]: I1203 23:43:34.946406 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6724c7f861003d1ef61b46d79d37f10292ce5d8b626262e2454ff89c6ed0ff34" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.094952 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:35 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:35 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:35 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.095018 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.300674 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.405676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/220eeac0-5b43-462d-89cc-5182a6b1f686-config-volume" (OuterVolumeSpecName: "config-volume") pod "220eeac0-5b43-462d-89cc-5182a6b1f686" (UID: "220eeac0-5b43-462d-89cc-5182a6b1f686"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.405734 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220eeac0-5b43-462d-89cc-5182a6b1f686-config-volume\") pod \"220eeac0-5b43-462d-89cc-5182a6b1f686\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.405834 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/220eeac0-5b43-462d-89cc-5182a6b1f686-secret-volume\") pod \"220eeac0-5b43-462d-89cc-5182a6b1f686\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.406034 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkjts\" (UniqueName: \"kubernetes.io/projected/220eeac0-5b43-462d-89cc-5182a6b1f686-kube-api-access-lkjts\") pod \"220eeac0-5b43-462d-89cc-5182a6b1f686\" (UID: \"220eeac0-5b43-462d-89cc-5182a6b1f686\") " Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.406442 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/220eeac0-5b43-462d-89cc-5182a6b1f686-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.426602 4764 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/220eeac0-5b43-462d-89cc-5182a6b1f686-kube-api-access-lkjts" (OuterVolumeSpecName: "kube-api-access-lkjts") pod "220eeac0-5b43-462d-89cc-5182a6b1f686" (UID: "220eeac0-5b43-462d-89cc-5182a6b1f686"). InnerVolumeSpecName "kube-api-access-lkjts". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.426912 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/220eeac0-5b43-462d-89cc-5182a6b1f686-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "220eeac0-5b43-462d-89cc-5182a6b1f686" (UID: "220eeac0-5b43-462d-89cc-5182a6b1f686"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.507299 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/220eeac0-5b43-462d-89cc-5182a6b1f686-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:35 crc kubenswrapper[4764]: I1203 23:43:35.507337 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkjts\" (UniqueName: \"kubernetes.io/projected/220eeac0-5b43-462d-89cc-5182a6b1f686-kube-api-access-lkjts\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.004374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1a292c37-c957-4c88-9ea7-486e96915143","Type":"ContainerStarted","Data":"5d1e2f278b54961845f971c3dd0cc53319fc21f4041bfb3310ec268f68d01812"} Dec 03 23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.012313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" event={"ID":"220eeac0-5b43-462d-89cc-5182a6b1f686","Type":"ContainerDied","Data":"f109d6458e1242fa9a7d2bc51060686bc700a5612d3f901afde32aa57e78eae8"} Dec 03 
23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.012346 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f109d6458e1242fa9a7d2bc51060686bc700a5612d3f901afde32aa57e78eae8" Dec 03 23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.012356 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8" Dec 03 23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.022432 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.022417465 podStartE2EDuration="3.022417465s" podCreationTimestamp="2025-12-03 23:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:43:36.0190766 +0000 UTC m=+151.780401011" watchObservedRunningTime="2025-12-03 23:43:36.022417465 +0000 UTC m=+151.783741876" Dec 03 23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.095815 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:36 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:36 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:36 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:36 crc kubenswrapper[4764]: I1203 23:43:36.095861 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:37 crc kubenswrapper[4764]: I1203 23:43:37.027116 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="1a292c37-c957-4c88-9ea7-486e96915143" containerID="5d1e2f278b54961845f971c3dd0cc53319fc21f4041bfb3310ec268f68d01812" exitCode=0 Dec 03 23:43:37 crc kubenswrapper[4764]: I1203 23:43:37.027160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1a292c37-c957-4c88-9ea7-486e96915143","Type":"ContainerDied","Data":"5d1e2f278b54961845f971c3dd0cc53319fc21f4041bfb3310ec268f68d01812"} Dec 03 23:43:37 crc kubenswrapper[4764]: I1203 23:43:37.091440 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:37 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:37 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:37 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:37 crc kubenswrapper[4764]: I1203 23:43:37.091489 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:38 crc kubenswrapper[4764]: I1203 23:43:38.093623 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:38 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:38 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:38 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:38 crc kubenswrapper[4764]: I1203 23:43:38.093853 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" 
podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:38 crc kubenswrapper[4764]: I1203 23:43:38.591636 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2lqqs" Dec 03 23:43:39 crc kubenswrapper[4764]: I1203 23:43:39.100076 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:39 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:39 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:39 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:39 crc kubenswrapper[4764]: I1203 23:43:39.100142 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:40 crc kubenswrapper[4764]: I1203 23:43:40.090964 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:40 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:40 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:40 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:40 crc kubenswrapper[4764]: I1203 23:43:40.091039 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:41 crc 
kubenswrapper[4764]: I1203 23:43:41.092135 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:41 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:41 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:41 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:41 crc kubenswrapper[4764]: I1203 23:43:41.092507 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:41 crc kubenswrapper[4764]: I1203 23:43:41.977681 4764 patch_prober.go:28] interesting pod/console-f9d7485db-l7xdw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Dec 03 23:43:41 crc kubenswrapper[4764]: I1203 23:43:41.977747 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-l7xdw" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerName="console" probeResult="failure" output="Get \"https://10.217.0.21:8443/health\": dial tcp 10.217.0.21:8443: connect: connection refused" Dec 03 23:43:42 crc kubenswrapper[4764]: I1203 23:43:42.090592 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:42 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:42 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:42 crc kubenswrapper[4764]: 
healthz check failed Dec 03 23:43:42 crc kubenswrapper[4764]: I1203 23:43:42.090670 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:43 crc kubenswrapper[4764]: I1203 23:43:43.091475 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:43 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:43 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:43 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:43 crc kubenswrapper[4764]: I1203 23:43:43.091807 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:44 crc kubenswrapper[4764]: I1203 23:43:44.092457 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:44 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:44 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:44 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:44 crc kubenswrapper[4764]: I1203 23:43:44.092552 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 
03 23:43:44 crc kubenswrapper[4764]: I1203 23:43:44.979386 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.074318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a292c37-c957-4c88-9ea7-486e96915143-kubelet-dir\") pod \"1a292c37-c957-4c88-9ea7-486e96915143\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.074427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a292c37-c957-4c88-9ea7-486e96915143-kube-api-access\") pod \"1a292c37-c957-4c88-9ea7-486e96915143\" (UID: \"1a292c37-c957-4c88-9ea7-486e96915143\") " Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.074483 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a292c37-c957-4c88-9ea7-486e96915143-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1a292c37-c957-4c88-9ea7-486e96915143" (UID: "1a292c37-c957-4c88-9ea7-486e96915143"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.074957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1a292c37-c957-4c88-9ea7-486e96915143","Type":"ContainerDied","Data":"520674f98347fd252b3d03df303e8346cf1940ff34dec9c4d767b5049cd25329"} Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.075013 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="520674f98347fd252b3d03df303e8346cf1940ff34dec9c4d767b5049cd25329" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.075096 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.075139 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1a292c37-c957-4c88-9ea7-486e96915143-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.082792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a292c37-c957-4c88-9ea7-486e96915143-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1a292c37-c957-4c88-9ea7-486e96915143" (UID: "1a292c37-c957-4c88-9ea7-486e96915143"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.093393 4764 patch_prober.go:28] interesting pod/router-default-5444994796-wxtcz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 23:43:45 crc kubenswrapper[4764]: [-]has-synced failed: reason withheld Dec 03 23:43:45 crc kubenswrapper[4764]: [+]process-running ok Dec 03 23:43:45 crc kubenswrapper[4764]: healthz check failed Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.093445 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wxtcz" podUID="ed746947-b1ba-426d-92a8-02db2a949e4b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 23:43:45 crc kubenswrapper[4764]: I1203 23:43:45.176269 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a292c37-c957-4c88-9ea7-486e96915143-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:43:46 crc kubenswrapper[4764]: I1203 23:43:46.095565 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:46 crc kubenswrapper[4764]: I1203 23:43:46.101661 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wxtcz" Dec 03 23:43:46 crc kubenswrapper[4764]: I1203 23:43:46.188488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:46 crc kubenswrapper[4764]: I1203 23:43:46.210301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/acd1bf47-f475-47f3-95a7-2e0cecec15aa-metrics-certs\") pod \"network-metrics-daemon-9fkg4\" (UID: \"acd1bf47-f475-47f3-95a7-2e0cecec15aa\") " pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:46 crc kubenswrapper[4764]: I1203 23:43:46.409300 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9fkg4" Dec 03 23:43:50 crc kubenswrapper[4764]: I1203 23:43:50.112688 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" Dec 03 23:43:50 crc kubenswrapper[4764]: I1203 23:43:50.868704 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:43:50 crc kubenswrapper[4764]: I1203 23:43:50.868800 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:43:51 crc kubenswrapper[4764]: I1203 23:43:51.984889 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:43:51 crc kubenswrapper[4764]: I1203 23:43:51.990905 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:44:02 crc kubenswrapper[4764]: E1203 23:44:02.375546 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 23:44:02 crc kubenswrapper[4764]: E1203 23:44:02.376202 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tnvn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zwfxp_openshift-marketplace(d1fde235-cf99-4964-ba2e-df36c34906b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:02 crc kubenswrapper[4764]: E1203 23:44:02.377964 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zwfxp" 
podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" Dec 03 23:44:03 crc kubenswrapper[4764]: I1203 23:44:03.196997 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kqhxc" Dec 03 23:44:03 crc kubenswrapper[4764]: E1203 23:44:03.678568 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zwfxp" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" Dec 03 23:44:05 crc kubenswrapper[4764]: E1203 23:44:05.042489 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 23:44:05 crc kubenswrapper[4764]: E1203 23:44:05.042990 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jctsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-d7jhj_openshift-marketplace(54783f7d-0533-48d7-b80a-cc9e7941ebaf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:05 crc kubenswrapper[4764]: E1203 23:44:05.044206 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-d7jhj" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" Dec 03 23:44:05 crc 
kubenswrapper[4764]: E1203 23:44:05.098304 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 23:44:05 crc kubenswrapper[4764]: E1203 23:44:05.098449 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nqm79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-wdjgl_openshift-marketplace(74e0af5e-bc95-4918-9c09-524e159e1eba): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:05 crc kubenswrapper[4764]: E1203 23:44:05.099654 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wdjgl" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" Dec 03 23:44:07 crc kubenswrapper[4764]: E1203 23:44:07.381667 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wdjgl" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" Dec 03 23:44:07 crc kubenswrapper[4764]: E1203 23:44:07.381776 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-d7jhj" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" Dec 03 23:44:07 crc kubenswrapper[4764]: E1203 23:44:07.463582 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 23:44:07 crc kubenswrapper[4764]: E1203 23:44:07.464016 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwn5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-z4gdg_openshift-marketplace(a8a2e712-9c08-4688-905c-4aa0af0a2dab): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:07 crc kubenswrapper[4764]: E1203 23:44:07.465185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-z4gdg" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.505042 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-z4gdg" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.584650 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.584949 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kwlzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gbjlz_openshift-marketplace(e5bc496d-9c32-491f-95da-41cf3850be09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.586112 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gbjlz" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" Dec 03 23:44:08 crc 
kubenswrapper[4764]: E1203 23:44:08.605537 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.605751 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v26s2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-zz8jv_openshift-marketplace(8d09c74d-dc25-4769-9580-88f3ea4fcf8e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.606940 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zz8jv" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.609311 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.609437 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-65b4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vn9sk_openshift-marketplace(84e0b7ae-01df-4863-b257-afb9a27507cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.610661 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vn9sk" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" Dec 03 23:44:08 crc 
kubenswrapper[4764]: E1203 23:44:08.626619 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.626790 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-42drr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-4dlcs_openshift-marketplace(3ef5d8b9-01e5-4668-aa91-0406a52a40ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 23:44:08 crc kubenswrapper[4764]: E1203 23:44:08.629228 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-4dlcs" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" Dec 03 23:44:08 crc kubenswrapper[4764]: I1203 23:44:08.916195 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9fkg4"] Dec 03 23:44:09 crc kubenswrapper[4764]: I1203 23:44:09.232546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" event={"ID":"acd1bf47-f475-47f3-95a7-2e0cecec15aa","Type":"ContainerStarted","Data":"0704d23bf62ae86ca0d706203c256b07379b941485c6791ee466e7736dce0d89"} Dec 03 23:44:09 crc kubenswrapper[4764]: I1203 23:44:09.233057 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" event={"ID":"acd1bf47-f475-47f3-95a7-2e0cecec15aa","Type":"ContainerStarted","Data":"5f0ae6e8d2b83389a965935bdf36bcabeaacd165026814e9e5b9f6b3ab87d01d"} Dec 03 23:44:09 crc kubenswrapper[4764]: E1203 23:44:09.235885 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vn9sk" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" Dec 03 23:44:09 crc kubenswrapper[4764]: E1203 23:44:09.235893 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zz8jv" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" Dec 03 23:44:09 crc kubenswrapper[4764]: E1203 23:44:09.235913 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-4dlcs" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" Dec 03 23:44:09 crc kubenswrapper[4764]: E1203 23:44:09.235947 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gbjlz" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" Dec 03 23:44:10 crc kubenswrapper[4764]: I1203 23:44:10.240305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9fkg4" event={"ID":"acd1bf47-f475-47f3-95a7-2e0cecec15aa","Type":"ContainerStarted","Data":"f46a760faef5e108051eb7cc973b4dc91598722d2826377fe29bfd0a5505d8a6"} Dec 03 23:44:10 crc kubenswrapper[4764]: I1203 23:44:10.254693 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9fkg4" podStartSLOduration=167.254676883 podStartE2EDuration="2m47.254676883s" podCreationTimestamp="2025-12-03 23:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:44:10.253092849 +0000 UTC m=+186.014417290" watchObservedRunningTime="2025-12-03 23:44:10.254676883 +0000 UTC m=+186.016001294" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 
23:44:11.791197 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822233 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 23:44:11 crc kubenswrapper[4764]: E1203 23:44:11.822497 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c2be04c-6e4b-43f3-b55c-1966078088a2" containerName="pruner" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822512 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c2be04c-6e4b-43f3-b55c-1966078088a2" containerName="pruner" Dec 03 23:44:11 crc kubenswrapper[4764]: E1203 23:44:11.822529 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a292c37-c957-4c88-9ea7-486e96915143" containerName="pruner" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822538 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a292c37-c957-4c88-9ea7-486e96915143" containerName="pruner" Dec 03 23:44:11 crc kubenswrapper[4764]: E1203 23:44:11.822550 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="220eeac0-5b43-462d-89cc-5182a6b1f686" containerName="collect-profiles" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822559 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="220eeac0-5b43-462d-89cc-5182a6b1f686" containerName="collect-profiles" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c2be04c-6e4b-43f3-b55c-1966078088a2" containerName="pruner" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822685 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a292c37-c957-4c88-9ea7-486e96915143" containerName="pruner" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.822700 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="220eeac0-5b43-462d-89cc-5182a6b1f686" containerName="collect-profiles" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.823130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.829085 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.829292 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.835071 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.863869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/923d3354-6842-48f7-af71-af7a98a20fa2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.863953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/923d3354-6842-48f7-af71-af7a98a20fa2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.964669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/923d3354-6842-48f7-af71-af7a98a20fa2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.964768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/923d3354-6842-48f7-af71-af7a98a20fa2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.965075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/923d3354-6842-48f7-af71-af7a98a20fa2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:11 crc kubenswrapper[4764]: I1203 23:44:11.990515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/923d3354-6842-48f7-af71-af7a98a20fa2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:12 crc kubenswrapper[4764]: I1203 23:44:12.147674 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:12 crc kubenswrapper[4764]: I1203 23:44:12.563492 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 23:44:13 crc kubenswrapper[4764]: I1203 23:44:13.257451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"923d3354-6842-48f7-af71-af7a98a20fa2","Type":"ContainerStarted","Data":"2a17f93ed7e3047b2c617e026b488d5ff8097ced7b45e02c584cfa108088a95a"} Dec 03 23:44:13 crc kubenswrapper[4764]: I1203 23:44:13.257909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"923d3354-6842-48f7-af71-af7a98a20fa2","Type":"ContainerStarted","Data":"5a1080042bade4a6fc556f13800fd7e25c2fed194972a0ca624b810ef8b3aa5f"} Dec 03 23:44:13 crc kubenswrapper[4764]: I1203 23:44:13.290259 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.290235211 podStartE2EDuration="2.290235211s" podCreationTimestamp="2025-12-03 23:44:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:44:13.282215313 +0000 UTC m=+189.043539724" watchObservedRunningTime="2025-12-03 23:44:13.290235211 +0000 UTC m=+189.051559622" Dec 03 23:44:14 crc kubenswrapper[4764]: I1203 23:44:14.264843 4764 generic.go:334] "Generic (PLEG): container finished" podID="923d3354-6842-48f7-af71-af7a98a20fa2" containerID="2a17f93ed7e3047b2c617e026b488d5ff8097ced7b45e02c584cfa108088a95a" exitCode=0 Dec 03 23:44:14 crc kubenswrapper[4764]: I1203 23:44:14.264884 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"923d3354-6842-48f7-af71-af7a98a20fa2","Type":"ContainerDied","Data":"2a17f93ed7e3047b2c617e026b488d5ff8097ced7b45e02c584cfa108088a95a"} Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.528469 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.606474 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/923d3354-6842-48f7-af71-af7a98a20fa2-kubelet-dir\") pod \"923d3354-6842-48f7-af71-af7a98a20fa2\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.606774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/923d3354-6842-48f7-af71-af7a98a20fa2-kube-api-access\") pod \"923d3354-6842-48f7-af71-af7a98a20fa2\" (UID: \"923d3354-6842-48f7-af71-af7a98a20fa2\") " Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.606598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/923d3354-6842-48f7-af71-af7a98a20fa2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "923d3354-6842-48f7-af71-af7a98a20fa2" (UID: "923d3354-6842-48f7-af71-af7a98a20fa2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.607350 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/923d3354-6842-48f7-af71-af7a98a20fa2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.612903 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/923d3354-6842-48f7-af71-af7a98a20fa2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "923d3354-6842-48f7-af71-af7a98a20fa2" (UID: "923d3354-6842-48f7-af71-af7a98a20fa2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:44:15 crc kubenswrapper[4764]: I1203 23:44:15.707764 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/923d3354-6842-48f7-af71-af7a98a20fa2-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:16 crc kubenswrapper[4764]: I1203 23:44:16.284098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"923d3354-6842-48f7-af71-af7a98a20fa2","Type":"ContainerDied","Data":"5a1080042bade4a6fc556f13800fd7e25c2fed194972a0ca624b810ef8b3aa5f"} Dec 03 23:44:16 crc kubenswrapper[4764]: I1203 23:44:16.284338 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a1080042bade4a6fc556f13800fd7e25c2fed194972a0ca624b810ef8b3aa5f" Dec 03 23:44:16 crc kubenswrapper[4764]: I1203 23:44:16.284396 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 23:44:16 crc kubenswrapper[4764]: I1203 23:44:16.293026 4764 generic.go:334] "Generic (PLEG): container finished" podID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerID="694112cfbd7101f48208aefb00022be1442e644e55909d3abcd41562bb52557f" exitCode=0 Dec 03 23:44:16 crc kubenswrapper[4764]: I1203 23:44:16.293052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwfxp" event={"ID":"d1fde235-cf99-4964-ba2e-df36c34906b5","Type":"ContainerDied","Data":"694112cfbd7101f48208aefb00022be1442e644e55909d3abcd41562bb52557f"} Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.614157 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 23:44:17 crc kubenswrapper[4764]: E1203 23:44:17.614734 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="923d3354-6842-48f7-af71-af7a98a20fa2" containerName="pruner" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.614751 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="923d3354-6842-48f7-af71-af7a98a20fa2" containerName="pruner" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.614878 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="923d3354-6842-48f7-af71-af7a98a20fa2" containerName="pruner" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.615269 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.617619 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.618261 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.629621 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.732543 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.732595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ceb393b2-6803-4b90-8f7f-a1569931b27a-kube-api-access\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.732699 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-var-lock\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.834059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-var-lock\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.834155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.834173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ceb393b2-6803-4b90-8f7f-a1569931b27a-kube-api-access\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.834514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-var-lock\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.834553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.857323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ceb393b2-6803-4b90-8f7f-a1569931b27a-kube-api-access\") pod \"installer-9-crc\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " 
pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:17 crc kubenswrapper[4764]: I1203 23:44:17.931838 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:18 crc kubenswrapper[4764]: I1203 23:44:18.307750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwfxp" event={"ID":"d1fde235-cf99-4964-ba2e-df36c34906b5","Type":"ContainerStarted","Data":"ff66177fa95dd16a987c7cec6fe47c87b7e4c29feca2d33c0d60fcfd1a240da2"} Dec 03 23:44:18 crc kubenswrapper[4764]: I1203 23:44:18.332955 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zwfxp" podStartSLOduration=3.766203675 podStartE2EDuration="51.332935797s" podCreationTimestamp="2025-12-03 23:43:27 +0000 UTC" firstStartedPulling="2025-12-03 23:43:29.678816421 +0000 UTC m=+145.440140832" lastFinishedPulling="2025-12-03 23:44:17.245548543 +0000 UTC m=+193.006872954" observedRunningTime="2025-12-03 23:44:18.329643513 +0000 UTC m=+194.090967934" watchObservedRunningTime="2025-12-03 23:44:18.332935797 +0000 UTC m=+194.094260208" Dec 03 23:44:18 crc kubenswrapper[4764]: I1203 23:44:18.359730 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 23:44:18 crc kubenswrapper[4764]: W1203 23:44:18.376692 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podceb393b2_6803_4b90_8f7f_a1569931b27a.slice/crio-552fba2a0ea563b7924227df16cca62869ab9469574f60512b14c09a512fe9e6 WatchSource:0}: Error finding container 552fba2a0ea563b7924227df16cca62869ab9469574f60512b14c09a512fe9e6: Status 404 returned error can't find the container with id 552fba2a0ea563b7924227df16cca62869ab9469574f60512b14c09a512fe9e6 Dec 03 23:44:19 crc kubenswrapper[4764]: I1203 23:44:19.317264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ceb393b2-6803-4b90-8f7f-a1569931b27a","Type":"ContainerStarted","Data":"67e351f84402164969bb3ac1fee457cd5f412deefbb095cdada318099c26ae46"} Dec 03 23:44:19 crc kubenswrapper[4764]: I1203 23:44:19.317499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ceb393b2-6803-4b90-8f7f-a1569931b27a","Type":"ContainerStarted","Data":"552fba2a0ea563b7924227df16cca62869ab9469574f60512b14c09a512fe9e6"} Dec 03 23:44:19 crc kubenswrapper[4764]: I1203 23:44:19.331775 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.331754606 podStartE2EDuration="2.331754606s" podCreationTimestamp="2025-12-03 23:44:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:44:19.33013305 +0000 UTC m=+195.091457461" watchObservedRunningTime="2025-12-03 23:44:19.331754606 +0000 UTC m=+195.093079027" Dec 03 23:44:20 crc kubenswrapper[4764]: I1203 23:44:20.325282 4764 generic.go:334] "Generic (PLEG): container finished" podID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerID="c2b05dc87e4023a12df727e7a3511cfa179c9185b6768573e54b8ec82e8e5158" exitCode=0 Dec 03 23:44:20 crc kubenswrapper[4764]: I1203 23:44:20.326134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdjgl" event={"ID":"74e0af5e-bc95-4918-9c09-524e159e1eba","Type":"ContainerDied","Data":"c2b05dc87e4023a12df727e7a3511cfa179c9185b6768573e54b8ec82e8e5158"} Dec 03 23:44:20 crc kubenswrapper[4764]: I1203 23:44:20.868756 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 03 23:44:20 crc kubenswrapper[4764]: I1203 23:44:20.869045 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:44:21 crc kubenswrapper[4764]: I1203 23:44:21.333586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdjgl" event={"ID":"74e0af5e-bc95-4918-9c09-524e159e1eba","Type":"ContainerStarted","Data":"70c59ca6b10d02eab5718ee20fbecde9e963a55ebc182b1486cfbe913251ad13"} Dec 03 23:44:21 crc kubenswrapper[4764]: I1203 23:44:21.337051 4764 generic.go:334] "Generic (PLEG): container finished" podID="e5bc496d-9c32-491f-95da-41cf3850be09" containerID="2cb5c37f2cb10dded97449eeb3c2342f1d56fda7fbd51679cb02466c2f6cb21d" exitCode=0 Dec 03 23:44:21 crc kubenswrapper[4764]: I1203 23:44:21.337128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbjlz" event={"ID":"e5bc496d-9c32-491f-95da-41cf3850be09","Type":"ContainerDied","Data":"2cb5c37f2cb10dded97449eeb3c2342f1d56fda7fbd51679cb02466c2f6cb21d"} Dec 03 23:44:21 crc kubenswrapper[4764]: I1203 23:44:21.353794 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wdjgl" podStartSLOduration=2.347728613 podStartE2EDuration="52.353776129s" podCreationTimestamp="2025-12-03 23:43:29 +0000 UTC" firstStartedPulling="2025-12-03 23:43:30.700020346 +0000 UTC m=+146.461344757" lastFinishedPulling="2025-12-03 23:44:20.706067862 +0000 UTC m=+196.467392273" observedRunningTime="2025-12-03 23:44:21.353299515 +0000 UTC m=+197.114623926" watchObservedRunningTime="2025-12-03 23:44:21.353776129 +0000 UTC m=+197.115100540" Dec 03 23:44:22 crc kubenswrapper[4764]: I1203 
23:44:22.343799 4764 generic.go:334] "Generic (PLEG): container finished" podID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerID="f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6" exitCode=0 Dec 03 23:44:22 crc kubenswrapper[4764]: I1203 23:44:22.343873 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7jhj" event={"ID":"54783f7d-0533-48d7-b80a-cc9e7941ebaf","Type":"ContainerDied","Data":"f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6"} Dec 03 23:44:22 crc kubenswrapper[4764]: I1203 23:44:22.346117 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbjlz" event={"ID":"e5bc496d-9c32-491f-95da-41cf3850be09","Type":"ContainerStarted","Data":"6323439b1799b74f3d099a0efe93508fb38f55704644c69732d7e6de97c63176"} Dec 03 23:44:22 crc kubenswrapper[4764]: I1203 23:44:22.376767 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gbjlz" podStartSLOduration=3.193626284 podStartE2EDuration="55.376751739s" podCreationTimestamp="2025-12-03 23:43:27 +0000 UTC" firstStartedPulling="2025-12-03 23:43:29.671158623 +0000 UTC m=+145.432483034" lastFinishedPulling="2025-12-03 23:44:21.854284038 +0000 UTC m=+197.615608489" observedRunningTime="2025-12-03 23:44:22.375227158 +0000 UTC m=+198.136551559" watchObservedRunningTime="2025-12-03 23:44:22.376751739 +0000 UTC m=+198.138076150" Dec 03 23:44:23 crc kubenswrapper[4764]: I1203 23:44:23.351756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerStarted","Data":"1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2"} Dec 03 23:44:23 crc kubenswrapper[4764]: I1203 23:44:23.353335 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7jhj" 
event={"ID":"54783f7d-0533-48d7-b80a-cc9e7941ebaf","Type":"ContainerStarted","Data":"be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23"} Dec 03 23:44:23 crc kubenswrapper[4764]: I1203 23:44:23.389984 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d7jhj" podStartSLOduration=3.526846491 podStartE2EDuration="54.389971312s" podCreationTimestamp="2025-12-03 23:43:29 +0000 UTC" firstStartedPulling="2025-12-03 23:43:31.851219362 +0000 UTC m=+147.612543773" lastFinishedPulling="2025-12-03 23:44:22.714344183 +0000 UTC m=+198.475668594" observedRunningTime="2025-12-03 23:44:23.386732266 +0000 UTC m=+199.148056677" watchObservedRunningTime="2025-12-03 23:44:23.389971312 +0000 UTC m=+199.151295723" Dec 03 23:44:24 crc kubenswrapper[4764]: I1203 23:44:24.359471 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerID="1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2" exitCode=0 Dec 03 23:44:24 crc kubenswrapper[4764]: I1203 23:44:24.359524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerDied","Data":"1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2"} Dec 03 23:44:24 crc kubenswrapper[4764]: I1203 23:44:24.362409 4764 generic.go:334] "Generic (PLEG): container finished" podID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerID="b7a79b4ea53875743255dc069570ee4f2e12b920a8486271434898051defd3a8" exitCode=0 Dec 03 23:44:24 crc kubenswrapper[4764]: I1203 23:44:24.362456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn9sk" event={"ID":"84e0b7ae-01df-4863-b257-afb9a27507cd","Type":"ContainerDied","Data":"b7a79b4ea53875743255dc069570ee4f2e12b920a8486271434898051defd3a8"} Dec 03 23:44:26 crc kubenswrapper[4764]: I1203 23:44:26.377895 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerStarted","Data":"1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5"} Dec 03 23:44:26 crc kubenswrapper[4764]: I1203 23:44:26.381681 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn9sk" event={"ID":"84e0b7ae-01df-4863-b257-afb9a27507cd","Type":"ContainerStarted","Data":"015d7437593405fa0e2a06ccb869ca9c4c3ee0e000bb2b2c21536573d6912baa"} Dec 03 23:44:26 crc kubenswrapper[4764]: I1203 23:44:26.404136 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z4gdg" podStartSLOduration=3.814195442 podStartE2EDuration="56.401699377s" podCreationTimestamp="2025-12-03 23:43:30 +0000 UTC" firstStartedPulling="2025-12-03 23:43:32.856722311 +0000 UTC m=+148.618046722" lastFinishedPulling="2025-12-03 23:44:25.444226246 +0000 UTC m=+201.205550657" observedRunningTime="2025-12-03 23:44:26.398487482 +0000 UTC m=+202.159811893" watchObservedRunningTime="2025-12-03 23:44:26.401699377 +0000 UTC m=+202.163023798" Dec 03 23:44:26 crc kubenswrapper[4764]: I1203 23:44:26.413612 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vn9sk" podStartSLOduration=3.730119255 podStartE2EDuration="59.413596493s" podCreationTimestamp="2025-12-03 23:43:27 +0000 UTC" firstStartedPulling="2025-12-03 23:43:29.661669844 +0000 UTC m=+145.422994255" lastFinishedPulling="2025-12-03 23:44:25.345147082 +0000 UTC m=+201.106471493" observedRunningTime="2025-12-03 23:44:26.412631558 +0000 UTC m=+202.173955969" watchObservedRunningTime="2025-12-03 23:44:26.413596493 +0000 UTC m=+202.174920914" Dec 03 23:44:27 crc kubenswrapper[4764]: I1203 23:44:27.970516 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:44:27 crc kubenswrapper[4764]: I1203 23:44:27.971261 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.185216 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.185462 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.224701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.228442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.371822 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.371878 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.422978 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.442005 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:44:28 crc kubenswrapper[4764]: I1203 23:44:28.469452 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:44:29 crc 
kubenswrapper[4764]: I1203 23:44:29.911145 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:44:29 crc kubenswrapper[4764]: I1203 23:44:29.911213 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:44:29 crc kubenswrapper[4764]: I1203 23:44:29.960508 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:44:30 crc kubenswrapper[4764]: I1203 23:44:30.313972 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:44:30 crc kubenswrapper[4764]: I1203 23:44:30.314047 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:44:30 crc kubenswrapper[4764]: I1203 23:44:30.403492 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:44:30 crc kubenswrapper[4764]: I1203 23:44:30.430250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerStarted","Data":"cdd72437bc81ccf7c8bdf75d479473ecdc27f81a24ee6b24874d84a768356628"} Dec 03 23:44:30 crc kubenswrapper[4764]: I1203 23:44:30.480595 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:44:30 crc kubenswrapper[4764]: I1203 23:44:30.487428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.115344 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z4gdg" 
Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.116755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.189461 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.436915 4764 generic.go:334] "Generic (PLEG): container finished" podID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerID="a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83" exitCode=0 Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.437002 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4dlcs" event={"ID":"3ef5d8b9-01e5-4668-aa91-0406a52a40ca","Type":"ContainerDied","Data":"a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83"} Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.441251 4764 generic.go:334] "Generic (PLEG): container finished" podID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerID="cdd72437bc81ccf7c8bdf75d479473ecdc27f81a24ee6b24874d84a768356628" exitCode=0 Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.441400 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerDied","Data":"cdd72437bc81ccf7c8bdf75d479473ecdc27f81a24ee6b24874d84a768356628"} Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.497531 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.543772 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gbjlz"] Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.544018 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/community-operators-gbjlz" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="registry-server" containerID="cri-o://6323439b1799b74f3d099a0efe93508fb38f55704644c69732d7e6de97c63176" gracePeriod=2 Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.748127 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zwfxp"] Dec 03 23:44:31 crc kubenswrapper[4764]: I1203 23:44:31.748461 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zwfxp" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="registry-server" containerID="cri-o://ff66177fa95dd16a987c7cec6fe47c87b7e4c29feca2d33c0d60fcfd1a240da2" gracePeriod=2 Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.450130 4764 generic.go:334] "Generic (PLEG): container finished" podID="e5bc496d-9c32-491f-95da-41cf3850be09" containerID="6323439b1799b74f3d099a0efe93508fb38f55704644c69732d7e6de97c63176" exitCode=0 Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.450170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbjlz" event={"ID":"e5bc496d-9c32-491f-95da-41cf3850be09","Type":"ContainerDied","Data":"6323439b1799b74f3d099a0efe93508fb38f55704644c69732d7e6de97c63176"} Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.450610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gbjlz" event={"ID":"e5bc496d-9c32-491f-95da-41cf3850be09","Type":"ContainerDied","Data":"75eb5e35ba2bc25370ce73f0817a814d9305992cc109495ec2c910b9226a76b9"} Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.450625 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75eb5e35ba2bc25370ce73f0817a814d9305992cc109495ec2c910b9226a76b9" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.453240 4764 
generic.go:334] "Generic (PLEG): container finished" podID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerID="ff66177fa95dd16a987c7cec6fe47c87b7e4c29feca2d33c0d60fcfd1a240da2" exitCode=0 Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.453320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwfxp" event={"ID":"d1fde235-cf99-4964-ba2e-df36c34906b5","Type":"ContainerDied","Data":"ff66177fa95dd16a987c7cec6fe47c87b7e4c29feca2d33c0d60fcfd1a240da2"} Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.463511 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.655102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-utilities\") pod \"e5bc496d-9c32-491f-95da-41cf3850be09\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.655200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwlzd\" (UniqueName: \"kubernetes.io/projected/e5bc496d-9c32-491f-95da-41cf3850be09-kube-api-access-kwlzd\") pod \"e5bc496d-9c32-491f-95da-41cf3850be09\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.655227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-catalog-content\") pod \"e5bc496d-9c32-491f-95da-41cf3850be09\" (UID: \"e5bc496d-9c32-491f-95da-41cf3850be09\") " Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.657252 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-utilities" 
(OuterVolumeSpecName: "utilities") pod "e5bc496d-9c32-491f-95da-41cf3850be09" (UID: "e5bc496d-9c32-491f-95da-41cf3850be09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.665603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bc496d-9c32-491f-95da-41cf3850be09-kube-api-access-kwlzd" (OuterVolumeSpecName: "kube-api-access-kwlzd") pod "e5bc496d-9c32-491f-95da-41cf3850be09" (UID: "e5bc496d-9c32-491f-95da-41cf3850be09"). InnerVolumeSpecName "kube-api-access-kwlzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.719543 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5bc496d-9c32-491f-95da-41cf3850be09" (UID: "e5bc496d-9c32-491f-95da-41cf3850be09"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.756859 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.756896 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwlzd\" (UniqueName: \"kubernetes.io/projected/e5bc496d-9c32-491f-95da-41cf3850be09-kube-api-access-kwlzd\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:32 crc kubenswrapper[4764]: I1203 23:44:32.756906 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bc496d-9c32-491f-95da-41cf3850be09-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.081183 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.261454 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-catalog-content\") pod \"d1fde235-cf99-4964-ba2e-df36c34906b5\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.261621 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvn7\" (UniqueName: \"kubernetes.io/projected/d1fde235-cf99-4964-ba2e-df36c34906b5-kube-api-access-tnvn7\") pod \"d1fde235-cf99-4964-ba2e-df36c34906b5\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.261663 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-utilities\") pod \"d1fde235-cf99-4964-ba2e-df36c34906b5\" (UID: \"d1fde235-cf99-4964-ba2e-df36c34906b5\") " Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.263041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-utilities" (OuterVolumeSpecName: "utilities") pod "d1fde235-cf99-4964-ba2e-df36c34906b5" (UID: "d1fde235-cf99-4964-ba2e-df36c34906b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.266157 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1fde235-cf99-4964-ba2e-df36c34906b5-kube-api-access-tnvn7" (OuterVolumeSpecName: "kube-api-access-tnvn7") pod "d1fde235-cf99-4964-ba2e-df36c34906b5" (UID: "d1fde235-cf99-4964-ba2e-df36c34906b5"). InnerVolumeSpecName "kube-api-access-tnvn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.316459 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1fde235-cf99-4964-ba2e-df36c34906b5" (UID: "d1fde235-cf99-4964-ba2e-df36c34906b5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.363606 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnvn7\" (UniqueName: \"kubernetes.io/projected/d1fde235-cf99-4964-ba2e-df36c34906b5-kube-api-access-tnvn7\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.363661 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.363683 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1fde235-cf99-4964-ba2e-df36c34906b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.463268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zwfxp" event={"ID":"d1fde235-cf99-4964-ba2e-df36c34906b5","Type":"ContainerDied","Data":"53e63d0fa266ef727c79fe5522f9d3c49c436d6cb91b416bc8a1b64b56b197f3"} Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.463316 4764 scope.go:117] "RemoveContainer" containerID="ff66177fa95dd16a987c7cec6fe47c87b7e4c29feca2d33c0d60fcfd1a240da2" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.463438 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zwfxp" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.466468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4dlcs" event={"ID":"3ef5d8b9-01e5-4668-aa91-0406a52a40ca","Type":"ContainerStarted","Data":"6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b"} Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.487630 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gbjlz" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.487621 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerStarted","Data":"5196c7456fc8510942642cabb64ef1489895ef9ab749a596f4a9513709d585fd"} Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.489493 4764 scope.go:117] "RemoveContainer" containerID="694112cfbd7101f48208aefb00022be1442e644e55909d3abcd41562bb52557f" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.507112 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4dlcs" podStartSLOduration=4.012192908 podStartE2EDuration="1m6.507096517s" podCreationTimestamp="2025-12-03 23:43:27 +0000 UTC" firstStartedPulling="2025-12-03 23:43:29.658919146 +0000 UTC m=+145.420243557" lastFinishedPulling="2025-12-03 23:44:32.153822725 +0000 UTC m=+207.915147166" observedRunningTime="2025-12-03 23:44:33.501848747 +0000 UTC m=+209.263173158" watchObservedRunningTime="2025-12-03 23:44:33.507096517 +0000 UTC m=+209.268420928" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.521114 4764 scope.go:117] "RemoveContainer" containerID="d28d8ac0ba390876af5127f46e71ac9e07451d5c46e79c1311c749c86ebe3ddf" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.532011 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zz8jv" podStartSLOduration=3.164718799 podStartE2EDuration="1m3.531993449s" podCreationTimestamp="2025-12-03 23:43:30 +0000 UTC" firstStartedPulling="2025-12-03 23:43:31.852386495 +0000 UTC m=+147.613710906" lastFinishedPulling="2025-12-03 23:44:32.219661135 +0000 UTC m=+207.980985556" observedRunningTime="2025-12-03 23:44:33.530431537 +0000 UTC m=+209.291756018" watchObservedRunningTime="2025-12-03 23:44:33.531993449 +0000 UTC m=+209.293317860" Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.557044 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zwfxp"] Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.563639 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zwfxp"] Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.572051 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gbjlz"] Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.578926 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gbjlz"] Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.943696 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7jhj"] Dec 03 23:44:33 crc kubenswrapper[4764]: I1203 23:44:33.943927 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d7jhj" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="registry-server" containerID="cri-o://be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23" gracePeriod=2 Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.148304 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4gdg"] Dec 03 23:44:34 crc 
kubenswrapper[4764]: I1203 23:44:34.420922 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.495333 4764 generic.go:334] "Generic (PLEG): container finished" podID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerID="be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23" exitCode=0 Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.495411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7jhj" event={"ID":"54783f7d-0533-48d7-b80a-cc9e7941ebaf","Type":"ContainerDied","Data":"be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23"} Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.495447 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7jhj" event={"ID":"54783f7d-0533-48d7-b80a-cc9e7941ebaf","Type":"ContainerDied","Data":"9405ece50c5fd33927258e287d115bf77df0fb4a1cd02afaac8716a4d9482c10"} Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.495473 4764 scope.go:117] "RemoveContainer" containerID="be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.495619 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7jhj" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.497584 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z4gdg" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="registry-server" containerID="cri-o://1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5" gracePeriod=2 Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.513241 4764 scope.go:117] "RemoveContainer" containerID="f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.529867 4764 scope.go:117] "RemoveContainer" containerID="069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.543863 4764 scope.go:117] "RemoveContainer" containerID="be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23" Dec 03 23:44:34 crc kubenswrapper[4764]: E1203 23:44:34.544548 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23\": container with ID starting with be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23 not found: ID does not exist" containerID="be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.544601 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23"} err="failed to get container status \"be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23\": rpc error: code = NotFound desc = could not find container \"be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23\": container with ID starting with 
be59a598aa678ab0e46e7ba6c8f6bf8680b7e0d66e8885365bbff45326343f23 not found: ID does not exist" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.544666 4764 scope.go:117] "RemoveContainer" containerID="f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6" Dec 03 23:44:34 crc kubenswrapper[4764]: E1203 23:44:34.545101 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6\": container with ID starting with f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6 not found: ID does not exist" containerID="f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.545131 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6"} err="failed to get container status \"f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6\": rpc error: code = NotFound desc = could not find container \"f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6\": container with ID starting with f0b710ca954c5740214cd969ce4caf191d063dff0ade8a3c8a14190cd923c1b6 not found: ID does not exist" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.545152 4764 scope.go:117] "RemoveContainer" containerID="069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775" Dec 03 23:44:34 crc kubenswrapper[4764]: E1203 23:44:34.545488 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775\": container with ID starting with 069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775 not found: ID does not exist" containerID="069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775" Dec 03 23:44:34 crc 
kubenswrapper[4764]: I1203 23:44:34.545511 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775"} err="failed to get container status \"069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775\": rpc error: code = NotFound desc = could not find container \"069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775\": container with ID starting with 069e59f223d2746a3e6a4c8697b839b0f7810d6d0f907478616109b739e39775 not found: ID does not exist" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.550752 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" path="/var/lib/kubelet/pods/d1fde235-cf99-4964-ba2e-df36c34906b5/volumes" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.551355 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" path="/var/lib/kubelet/pods/e5bc496d-9c32-491f-95da-41cf3850be09/volumes" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.579510 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jctsc\" (UniqueName: \"kubernetes.io/projected/54783f7d-0533-48d7-b80a-cc9e7941ebaf-kube-api-access-jctsc\") pod \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.579579 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-catalog-content\") pod \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.579620 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-utilities\") pod \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\" (UID: \"54783f7d-0533-48d7-b80a-cc9e7941ebaf\") " Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.580389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-utilities" (OuterVolumeSpecName: "utilities") pod "54783f7d-0533-48d7-b80a-cc9e7941ebaf" (UID: "54783f7d-0533-48d7-b80a-cc9e7941ebaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.580538 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.582636 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54783f7d-0533-48d7-b80a-cc9e7941ebaf-kube-api-access-jctsc" (OuterVolumeSpecName: "kube-api-access-jctsc") pod "54783f7d-0533-48d7-b80a-cc9e7941ebaf" (UID: "54783f7d-0533-48d7-b80a-cc9e7941ebaf"). InnerVolumeSpecName "kube-api-access-jctsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.596431 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54783f7d-0533-48d7-b80a-cc9e7941ebaf" (UID: "54783f7d-0533-48d7-b80a-cc9e7941ebaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.682039 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jctsc\" (UniqueName: \"kubernetes.io/projected/54783f7d-0533-48d7-b80a-cc9e7941ebaf-kube-api-access-jctsc\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.682087 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54783f7d-0533-48d7-b80a-cc9e7941ebaf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.833195 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7jhj"] Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.836834 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7jhj"] Dec 03 23:44:34 crc kubenswrapper[4764]: I1203 23:44:34.904161 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.086436 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwn5r\" (UniqueName: \"kubernetes.io/projected/a8a2e712-9c08-4688-905c-4aa0af0a2dab-kube-api-access-pwn5r\") pod \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.086863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-catalog-content\") pod \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.087029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-utilities\") pod \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\" (UID: \"a8a2e712-9c08-4688-905c-4aa0af0a2dab\") " Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.087945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-utilities" (OuterVolumeSpecName: "utilities") pod "a8a2e712-9c08-4688-905c-4aa0af0a2dab" (UID: "a8a2e712-9c08-4688-905c-4aa0af0a2dab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.093942 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a2e712-9c08-4688-905c-4aa0af0a2dab-kube-api-access-pwn5r" (OuterVolumeSpecName: "kube-api-access-pwn5r") pod "a8a2e712-9c08-4688-905c-4aa0af0a2dab" (UID: "a8a2e712-9c08-4688-905c-4aa0af0a2dab"). InnerVolumeSpecName "kube-api-access-pwn5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.189138 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.189187 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwn5r\" (UniqueName: \"kubernetes.io/projected/a8a2e712-9c08-4688-905c-4aa0af0a2dab-kube-api-access-pwn5r\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.276942 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a8a2e712-9c08-4688-905c-4aa0af0a2dab" (UID: "a8a2e712-9c08-4688-905c-4aa0af0a2dab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.290867 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8a2e712-9c08-4688-905c-4aa0af0a2dab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.506158 4764 generic.go:334] "Generic (PLEG): container finished" podID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerID="1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5" exitCode=0 Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.506285 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z4gdg" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.506284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerDied","Data":"1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5"} Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.506504 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4gdg" event={"ID":"a8a2e712-9c08-4688-905c-4aa0af0a2dab","Type":"ContainerDied","Data":"90d129e59472fc244b710da977c19f63750cb7e1cf40481720359c41815e8770"} Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.506565 4764 scope.go:117] "RemoveContainer" containerID="1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.529906 4764 scope.go:117] "RemoveContainer" containerID="1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.556917 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4gdg"] Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.560449 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z4gdg"] Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.571048 4764 scope.go:117] "RemoveContainer" containerID="b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.590450 4764 scope.go:117] "RemoveContainer" containerID="1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5" Dec 03 23:44:35 crc kubenswrapper[4764]: E1203 23:44:35.591170 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5\": container with ID starting with 1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5 not found: ID does not exist" containerID="1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.591207 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5"} err="failed to get container status \"1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5\": rpc error: code = NotFound desc = could not find container \"1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5\": container with ID starting with 1c7b5bc9b9c28d54756aef48cdeb3c7f200d6b6d3c9397ebafc41b838479d0e5 not found: ID does not exist" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.591234 4764 scope.go:117] "RemoveContainer" containerID="1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2" Dec 03 23:44:35 crc kubenswrapper[4764]: E1203 23:44:35.591518 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2\": container with ID starting with 1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2 not found: ID does not exist" containerID="1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.591541 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2"} err="failed to get container status \"1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2\": rpc error: code = NotFound desc = could not find container \"1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2\": container with ID 
starting with 1378cd78ddbfa5ec362cca0c810fb39ed460a05fc115c060764df84d86b03de2 not found: ID does not exist" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.591558 4764 scope.go:117] "RemoveContainer" containerID="b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f" Dec 03 23:44:35 crc kubenswrapper[4764]: E1203 23:44:35.591789 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f\": container with ID starting with b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f not found: ID does not exist" containerID="b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f" Dec 03 23:44:35 crc kubenswrapper[4764]: I1203 23:44:35.591815 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f"} err="failed to get container status \"b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f\": rpc error: code = NotFound desc = could not find container \"b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f\": container with ID starting with b46da9fb63ba7cf4340f943756269dd32b4cc699aeb323223d2b54af037b322f not found: ID does not exist" Dec 03 23:44:36 crc kubenswrapper[4764]: I1203 23:44:36.552317 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" path="/var/lib/kubelet/pods/54783f7d-0533-48d7-b80a-cc9e7941ebaf/volumes" Dec 03 23:44:36 crc kubenswrapper[4764]: I1203 23:44:36.554138 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" path="/var/lib/kubelet/pods/a8a2e712-9c08-4688-905c-4aa0af0a2dab/volumes" Dec 03 23:44:37 crc kubenswrapper[4764]: I1203 23:44:37.794470 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:44:37 crc kubenswrapper[4764]: I1203 23:44:37.794538 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:44:37 crc kubenswrapper[4764]: I1203 23:44:37.863504 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:44:38 crc kubenswrapper[4764]: I1203 23:44:38.033266 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:44:38 crc kubenswrapper[4764]: I1203 23:44:38.566479 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:44:40 crc kubenswrapper[4764]: I1203 23:44:40.939392 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:44:40 crc kubenswrapper[4764]: I1203 23:44:40.939845 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:44:40 crc kubenswrapper[4764]: I1203 23:44:40.998498 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:44:41 crc kubenswrapper[4764]: I1203 23:44:41.624193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:44:41 crc kubenswrapper[4764]: I1203 23:44:41.862526 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-cgmmp"] Dec 03 23:44:50 crc kubenswrapper[4764]: I1203 23:44:50.869481 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:44:50 crc kubenswrapper[4764]: I1203 23:44:50.870181 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:44:50 crc kubenswrapper[4764]: I1203 23:44:50.870242 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:44:50 crc kubenswrapper[4764]: I1203 23:44:50.871068 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:44:50 crc kubenswrapper[4764]: I1203 23:44:50.871164 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c" gracePeriod=600 Dec 03 23:44:55 crc kubenswrapper[4764]: I1203 23:44:55.635983 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c" exitCode=0 Dec 03 23:44:55 crc kubenswrapper[4764]: I1203 23:44:55.636689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c"} Dec 03 23:44:55 crc kubenswrapper[4764]: I1203 23:44:55.636782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"1c24f4c5aebcf81ce5b2876f342868df69c51bd15468e185acbae0af2aee2250"} Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.191921 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192169 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192184 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192196 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192204 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192217 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192222 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192231 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192237 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192245 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192250 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192258 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192263 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192271 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192278 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="extract-content" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192286 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192291 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192302 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192307 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192317 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192322 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="extract-utilities" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192331 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192336 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.192347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192353 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192436 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1fde235-cf99-4964-ba2e-df36c34906b5" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192446 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bc496d-9c32-491f-95da-41cf3850be09" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192454 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="54783f7d-0533-48d7-b80a-cc9e7941ebaf" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192462 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a2e712-9c08-4688-905c-4aa0af0a2dab" containerName="registry-server" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.192916 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.193116 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63" gracePeriod=15 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.193250 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.193628 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74" gracePeriod=15 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.193691 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3" gracePeriod=15 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.193898 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d" gracePeriod=15 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.193969 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e" gracePeriod=15 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194257 4764 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.194378 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194389 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.194402 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194409 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.194417 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194424 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.194433 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194438 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.194450 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194456 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.194463 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194468 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194543 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194550 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194561 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194568 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.194576 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.223775 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.304520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305170 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305192 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") 
pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305220 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305274 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.305295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406681 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406773 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406852 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406866 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406944 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406926 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406991 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.406920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.407046 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.407157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.407327 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.520382 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:44:56 crc kubenswrapper[4764]: W1203 23:44:56.555653 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-28def350ad471ffafad95352cb37c3e8a4fb0e3d297d11c6ab412bb93307ed34 WatchSource:0}: Error finding container 28def350ad471ffafad95352cb37c3e8a4fb0e3d297d11c6ab412bb93307ed34: Status 404 returned error can't find the container with id 28def350ad471ffafad95352cb37c3e8a4fb0e3d297d11c6ab412bb93307ed34 Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.561758 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dd93fb025dc8f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 23:44:56.560540815 +0000 UTC m=+232.321865296,LastTimestamp:2025-12-03 23:44:56.560540815 +0000 UTC m=+232.321865296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.645438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"28def350ad471ffafad95352cb37c3e8a4fb0e3d297d11c6ab412bb93307ed34"} Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.649591 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.650778 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74" exitCode=0 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.650801 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d" exitCode=0 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.650811 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e" exitCode=0 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.650824 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3" exitCode=2 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.652971 4764 generic.go:334] "Generic (PLEG): container finished" podID="ceb393b2-6803-4b90-8f7f-a1569931b27a" containerID="67e351f84402164969bb3ac1fee457cd5f412deefbb095cdada318099c26ae46" exitCode=0 Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.653058 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ceb393b2-6803-4b90-8f7f-a1569931b27a","Type":"ContainerDied","Data":"67e351f84402164969bb3ac1fee457cd5f412deefbb095cdada318099c26ae46"} Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.653870 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.654208 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.938952 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.940006 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.940451 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.940878 4764 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.941314 4764 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:56 crc kubenswrapper[4764]: I1203 23:44:56.941392 4764 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 23:44:56 crc kubenswrapper[4764]: E1203 23:44:56.941837 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="200ms" Dec 03 23:44:57 crc kubenswrapper[4764]: E1203 23:44:57.142832 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="400ms" Dec 03 23:44:57 crc kubenswrapper[4764]: E1203 23:44:57.544650 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="800ms" Dec 03 23:44:57 crc kubenswrapper[4764]: I1203 23:44:57.663333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783"} Dec 03 23:44:57 crc kubenswrapper[4764]: I1203 23:44:57.664161 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:57 crc kubenswrapper[4764]: I1203 23:44:57.664579 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:57 crc kubenswrapper[4764]: I1203 23:44:57.979396 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:57 crc kubenswrapper[4764]: I1203 23:44:57.980273 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:57 crc kubenswrapper[4764]: I1203 23:44:57.980649 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034405 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-var-lock\") pod \"ceb393b2-6803-4b90-8f7f-a1569931b27a\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034476 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-var-lock" (OuterVolumeSpecName: "var-lock") pod "ceb393b2-6803-4b90-8f7f-a1569931b27a" (UID: "ceb393b2-6803-4b90-8f7f-a1569931b27a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ceb393b2-6803-4b90-8f7f-a1569931b27a-kube-api-access\") pod \"ceb393b2-6803-4b90-8f7f-a1569931b27a\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-kubelet-dir\") pod \"ceb393b2-6803-4b90-8f7f-a1569931b27a\" (UID: \"ceb393b2-6803-4b90-8f7f-a1569931b27a\") " Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034726 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ceb393b2-6803-4b90-8f7f-a1569931b27a" (UID: "ceb393b2-6803-4b90-8f7f-a1569931b27a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034957 4764 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.034976 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ceb393b2-6803-4b90-8f7f-a1569931b27a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.039429 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb393b2-6803-4b90-8f7f-a1569931b27a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ceb393b2-6803-4b90-8f7f-a1569931b27a" (UID: "ceb393b2-6803-4b90-8f7f-a1569931b27a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.137485 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ceb393b2-6803-4b90-8f7f-a1569931b27a-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.345584 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="1.6s" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.614955 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.616428 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.617057 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.617486 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.617950 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.643863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644066 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644127 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644240 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644473 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644782 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644852 4764 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.644909 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.672581 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.673597 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63" exitCode=0 Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.673702 4764 scope.go:117] "RemoveContainer" containerID="7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.673941 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.676366 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.676431 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ceb393b2-6803-4b90-8f7f-a1569931b27a","Type":"ContainerDied","Data":"552fba2a0ea563b7924227df16cca62869ab9469574f60512b14c09a512fe9e6"} Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.676644 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="552fba2a0ea563b7924227df16cca62869ab9469574f60512b14c09a512fe9e6" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.681485 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.682140 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.683753 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.699004 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.699499 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.699999 4764 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.702308 4764 scope.go:117] "RemoveContainer" containerID="6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.718811 4764 scope.go:117] "RemoveContainer" containerID="6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.736331 4764 scope.go:117] "RemoveContainer" containerID="b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.759144 4764 scope.go:117] "RemoveContainer" containerID="c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.782649 4764 scope.go:117] "RemoveContainer" containerID="eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.811783 4764 scope.go:117] 
"RemoveContainer" containerID="7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.812327 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\": container with ID starting with 7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74 not found: ID does not exist" containerID="7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.812366 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74"} err="failed to get container status \"7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\": rpc error: code = NotFound desc = could not find container \"7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74\": container with ID starting with 7439f4ff169e44bb7fddae13187acedf1dc0c342c19c059f4fbb86c826527a74 not found: ID does not exist" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.812395 4764 scope.go:117] "RemoveContainer" containerID="6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.812741 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\": container with ID starting with 6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d not found: ID does not exist" containerID="6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.812772 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d"} err="failed to get container status \"6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\": rpc error: code = NotFound desc = could not find container \"6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d\": container with ID starting with 6877a8901d303dfd24e5400e7765757f2c0b752d6713a27a802e9042bd5e9e7d not found: ID does not exist" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.812791 4764 scope.go:117] "RemoveContainer" containerID="6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.813073 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\": container with ID starting with 6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e not found: ID does not exist" containerID="6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.813102 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e"} err="failed to get container status \"6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\": rpc error: code = NotFound desc = could not find container \"6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e\": container with ID starting with 6d412735366e9042dbcb925217a4b0122a205d060e233cc97c8d2fa22ef6af6e not found: ID does not exist" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.813206 4764 scope.go:117] "RemoveContainer" containerID="b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.813553 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\": container with ID starting with b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3 not found: ID does not exist" containerID="b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.813596 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3"} err="failed to get container status \"b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\": rpc error: code = NotFound desc = could not find container \"b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3\": container with ID starting with b3b9b845b480c1e1b4b04f204072e56d8b15234d67c2f84cce11b125bc827af3 not found: ID does not exist" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.813620 4764 scope.go:117] "RemoveContainer" containerID="c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.813965 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\": container with ID starting with c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63 not found: ID does not exist" containerID="c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.813990 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63"} err="failed to get container status \"c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\": rpc error: code = NotFound desc = could not find container 
\"c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63\": container with ID starting with c3e41330f757291f5158e3028d632073977542167398016bc6a7d121bca91b63 not found: ID does not exist" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.814006 4764 scope.go:117] "RemoveContainer" containerID="eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d" Dec 03 23:44:58 crc kubenswrapper[4764]: E1203 23:44:58.814399 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\": container with ID starting with eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d not found: ID does not exist" containerID="eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d" Dec 03 23:44:58 crc kubenswrapper[4764]: I1203 23:44:58.814454 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d"} err="failed to get container status \"eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\": rpc error: code = NotFound desc = could not find container \"eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d\": container with ID starting with eaa030897c1b86252acc1101d6d5380dbb87feb4639b7c1f73a1ac4599306b2d not found: ID does not exist" Dec 03 23:44:59 crc kubenswrapper[4764]: E1203 23:44:59.947220 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="3.2s" Dec 03 23:45:00 crc kubenswrapper[4764]: I1203 23:45:00.557422 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" 
path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 23:45:03 crc kubenswrapper[4764]: E1203 23:45:03.148492 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="6.4s" Dec 03 23:45:03 crc kubenswrapper[4764]: E1203 23:45:03.424807 4764 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dd93fb025dc8f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 23:44:56.560540815 +0000 UTC m=+232.321865296,LastTimestamp:2025-12-03 23:44:56.560540815 +0000 UTC m=+232.321865296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 23:45:04 crc kubenswrapper[4764]: I1203 23:45:04.551681 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 
23:45:04 crc kubenswrapper[4764]: I1203 23:45:04.552144 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:06 crc kubenswrapper[4764]: I1203 23:45:06.886652 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" containerName="oauth-openshift" containerID="cri-o://3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960" gracePeriod=15 Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.279315 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.280227 4764 status_manager.go:851] "Failed to get status for pod" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-cgmmp\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.280450 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.280689 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460258 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-provider-selection\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460303 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-serving-cert\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-trusted-ca-bundle\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460351 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-dir\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460378 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-router-certs\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-cliconfig\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-ocp-branding-template\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460460 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-error\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460484 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxzk5\" (UniqueName: \"kubernetes.io/projected/28c57be9-2500-4944-abed-6fe2e4e2dd0d-kube-api-access-jxzk5\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-idp-0-file-data\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460522 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-login\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-service-ca\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-policies\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-session\") pod \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\" (UID: \"28c57be9-2500-4944-abed-6fe2e4e2dd0d\") " Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.460981 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: 
"28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.461885 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.461906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.461884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.462917 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.468019 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.468472 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.468681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.469598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.472547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c57be9-2500-4944-abed-6fe2e4e2dd0d-kube-api-access-jxzk5" (OuterVolumeSpecName: "kube-api-access-jxzk5") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "kube-api-access-jxzk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.473242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.473272 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.473806 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.474406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "28c57be9-2500-4944-abed-6fe2e4e2dd0d" (UID: "28c57be9-2500-4944-abed-6fe2e4e2dd0d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562599 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562654 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxzk5\" (UniqueName: \"kubernetes.io/projected/28c57be9-2500-4944-abed-6fe2e4e2dd0d-kube-api-access-jxzk5\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562675 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562697 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562756 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562775 4764 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562795 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562816 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562835 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562853 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562873 4764 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/28c57be9-2500-4944-abed-6fe2e4e2dd0d-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562891 4764 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562908 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.562926 4764 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/28c57be9-2500-4944-abed-6fe2e4e2dd0d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.750925 4764 generic.go:334] "Generic (PLEG): container finished" podID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" containerID="3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960" exitCode=0 Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.751003 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" event={"ID":"28c57be9-2500-4944-abed-6fe2e4e2dd0d","Type":"ContainerDied","Data":"3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960"} Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.751057 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" event={"ID":"28c57be9-2500-4944-abed-6fe2e4e2dd0d","Type":"ContainerDied","Data":"9b82c055e222596b7ff55f2ee550ae1d19c1fc97d281fe5d115833d167ee5196"} Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.751065 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.751086 4764 scope.go:117] "RemoveContainer" containerID="3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.752219 4764 status_manager.go:851] "Failed to get status for pod" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-cgmmp\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.753233 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.753760 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.780571 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.781295 4764 status_manager.go:851] "Failed to get status for pod" 
podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-cgmmp\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.782091 4764 scope.go:117] "RemoveContainer" containerID="3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.782119 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:07 crc kubenswrapper[4764]: E1203 23:45:07.782710 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960\": container with ID starting with 3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960 not found: ID does not exist" containerID="3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960" Dec 03 23:45:07 crc kubenswrapper[4764]: I1203 23:45:07.782790 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960"} err="failed to get container status \"3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960\": rpc error: code = NotFound desc = could not find container \"3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960\": container with ID starting with 3b436750327b938d40b3d707cabb17298a53d418d1001f2e6072e7a8c1d00960 not found: ID does not exist" Dec 03 23:45:09 crc kubenswrapper[4764]: E1203 
23:45:09.550415 4764 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="7s" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.545375 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.546501 4764 status_manager.go:851] "Failed to get status for pod" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-cgmmp\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.547626 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.548102 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.562623 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.562669 4764 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:10 crc kubenswrapper[4764]: E1203 23:45:10.563249 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.563868 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:10 crc kubenswrapper[4764]: W1203 23:45:10.585633 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-13e0672acac2aa973f36853644092255b4fdd5357237e15af3ef98260ace0775 WatchSource:0}: Error finding container 13e0672acac2aa973f36853644092255b4fdd5357237e15af3ef98260ace0775: Status 404 returned error can't find the container with id 13e0672acac2aa973f36853644092255b4fdd5357237e15af3ef98260ace0775 Dec 03 23:45:10 crc kubenswrapper[4764]: I1203 23:45:10.777247 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"13e0672acac2aa973f36853644092255b4fdd5357237e15af3ef98260ace0775"} Dec 03 23:45:11 crc kubenswrapper[4764]: E1203 23:45:11.646629 4764 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.13:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" 
volumeName="registry-storage" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.787944 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.788822 4764 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c" exitCode=1 Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.788932 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c"} Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.789591 4764 scope.go:117] "RemoveContainer" containerID="6dacd2c135270cbf463ce970e427ebc7b121f44ae52c821a6751d9146d66e21c" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.790234 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.791173 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.794999 4764 status_manager.go:851] "Failed to get status for pod" 
podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-cgmmp\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.796064 4764 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3acf29526e57418a94d1f1050b929c8a4f6f83f74932732ce461757ba54462f7" exitCode=0 Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.796107 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3acf29526e57418a94d1f1050b929c8a4f6f83f74932732ce461757ba54462f7"} Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.796167 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.796649 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.796680 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.797173 4764 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: E1203 23:45:11.797527 4764 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.797696 4764 status_manager.go:851] "Failed to get status for pod" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" pod="openshift-authentication/oauth-openshift-558db77b4-cgmmp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-cgmmp\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.798067 4764 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:11 crc kubenswrapper[4764]: I1203 23:45:11.798618 4764 status_manager.go:851] "Failed to get status for pod" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.13:6443: connect: connection refused" Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 23:45:12.190298 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 
23:45:12.805688 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 23:45:12.806019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f124f5759ae77506138ce9d752aebc05fbfa2194e63368cc58851d79049eb0d0"} Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 23:45:12.808935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8b513d2c476e4a98b05edb578498641ada29bad7f0714211e14afcf917037d80"} Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 23:45:12.808964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe4b0e3154418512296d1280d5c3346b2cb09fb3a2ab5e92bcb11e54b587ba74"} Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 23:45:12.808973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"614f11903ad4a15048f7fee9891d1439505610c6b826e15a79a6c59b1f884108"} Dec 03 23:45:12 crc kubenswrapper[4764]: I1203 23:45:12.808983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e8b752be0e85732345023798adce3f4d3f23df354583d685bc857f36f09f2ff0"} Dec 03 23:45:13 crc kubenswrapper[4764]: I1203 23:45:13.823590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"64e703c37d4a1060edd46db4f1a1b7f17b069b648816eebcc1e42d2daddfa493"} Dec 03 23:45:13 crc kubenswrapper[4764]: I1203 23:45:13.825122 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:13 crc kubenswrapper[4764]: I1203 23:45:13.825231 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:14 crc kubenswrapper[4764]: I1203 23:45:14.346223 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:45:14 crc kubenswrapper[4764]: I1203 23:45:14.394019 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:45:14 crc kubenswrapper[4764]: I1203 23:45:14.402294 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:45:15 crc kubenswrapper[4764]: I1203 23:45:15.564318 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:15 crc kubenswrapper[4764]: I1203 23:45:15.564366 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:15 crc kubenswrapper[4764]: I1203 23:45:15.568844 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:18 crc kubenswrapper[4764]: I1203 23:45:18.835065 4764 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:19 crc kubenswrapper[4764]: I1203 23:45:19.858142 4764 kubelet.go:1909] 
"Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:19 crc kubenswrapper[4764]: I1203 23:45:19.858655 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:19 crc kubenswrapper[4764]: I1203 23:45:19.858491 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:19 crc kubenswrapper[4764]: I1203 23:45:19.866287 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:19 crc kubenswrapper[4764]: I1203 23:45:19.869523 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="dfc0152c-7eb9-42c3-95ef-715c5c981077" Dec 03 23:45:20 crc kubenswrapper[4764]: I1203 23:45:20.864758 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:20 crc kubenswrapper[4764]: I1203 23:45:20.864807 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:21 crc kubenswrapper[4764]: I1203 23:45:21.871592 4764 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:21 crc kubenswrapper[4764]: I1203 23:45:21.871632 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bff4c6e5-5fff-40e5-babc-02077d45f75a" Dec 03 23:45:24 crc kubenswrapper[4764]: I1203 23:45:24.351996 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 23:45:24 crc kubenswrapper[4764]: I1203 23:45:24.574763 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="dfc0152c-7eb9-42c3-95ef-715c5c981077" Dec 03 23:45:28 crc kubenswrapper[4764]: I1203 23:45:28.113919 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.158094 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.172373 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.291112 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.426900 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.680981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.762548 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 23:45:29 crc kubenswrapper[4764]: I1203 23:45:29.962189 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.214121 4764 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.316901 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.492083 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.494022 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.616292 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.620483 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 23:45:30 crc kubenswrapper[4764]: I1203 23:45:30.826186 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.204830 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.220446 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.268109 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.414624 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.440939 4764 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.474616 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.689673 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 23:45:31 crc kubenswrapper[4764]: I1203 23:45:31.973266 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.083480 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.102795 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.147565 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.276919 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.301464 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.303218 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.313470 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.318921 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.345947 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.469816 4764 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.473826 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.473802944 podStartE2EDuration="36.473802944s" podCreationTimestamp="2025-12-03 23:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:45:18.619557569 +0000 UTC m=+254.380882000" watchObservedRunningTime="2025-12-03 23:45:32.473802944 +0000 UTC m=+268.235127385" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.478304 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-cgmmp"] Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.478383 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.484398 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.505354 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.505327323 podStartE2EDuration="14.505327323s" 
podCreationTimestamp="2025-12-03 23:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:45:32.50030954 +0000 UTC m=+268.261633961" watchObservedRunningTime="2025-12-03 23:45:32.505327323 +0000 UTC m=+268.266651744" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.554460 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" path="/var/lib/kubelet/pods/28c57be9-2500-4944-abed-6fe2e4e2dd0d/volumes" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.586427 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.633598 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.813328 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.818803 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.851490 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.910347 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 23:45:32 crc kubenswrapper[4764]: I1203 23:45:32.941940 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.092588 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.102868 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.144319 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.240201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.243533 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.258296 4764 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.258858 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.271581 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.287857 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.466854 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.479119 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 23:45:33 crc 
kubenswrapper[4764]: I1203 23:45:33.515467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.519446 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.525282 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.540091 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.566503 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.624408 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.655283 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.709419 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 23:45:33 crc kubenswrapper[4764]: I1203 23:45:33.742180 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.011313 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.074958 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 
23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.119460 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.124044 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.290860 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.291202 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.300169 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.340148 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.357169 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.369057 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.375112 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.456489 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.459327 4764 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.467733 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.489801 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.497097 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.673557 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.732949 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.823136 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.844895 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 23:45:34 crc kubenswrapper[4764]: I1203 23:45:34.936604 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.092658 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.297388 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 23:45:35 crc 
kubenswrapper[4764]: I1203 23:45:35.311245 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.340599 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.345748 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.357344 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.366562 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.507991 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.513435 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.683763 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.685467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.768756 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.836985 4764 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.856785 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.868067 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.897381 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 23:45:35 crc kubenswrapper[4764]: I1203 23:45:35.978354 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.030513 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.034307 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.056473 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.144525 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.182626 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.240326 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 23:45:36 crc kubenswrapper[4764]: 
I1203 23:45:36.250129 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.254162 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.328658 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.402983 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.564261 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.614021 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.620877 4764 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.645784 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.771901 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.849647 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.861543 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 23:45:36 
crc kubenswrapper[4764]: I1203 23:45:36.870688 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 23:45:36 crc kubenswrapper[4764]: I1203 23:45:36.983241 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.179010 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.299614 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.303170 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.336593 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.399322 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.401493 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.497740 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.517030 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.599616 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 23:45:37 
crc kubenswrapper[4764]: I1203 23:45:37.600020 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.688288 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.739070 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.742411 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.762916 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.841107 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.854223 4764 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.895351 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 23:45:37 crc kubenswrapper[4764]: I1203 23:45:37.960301 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.026017 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.063705 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 
23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.113691 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.164365 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.166398 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.177371 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.187861 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.253196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.270058 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.317625 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.347444 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.362034 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.403469 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.488336 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.597263 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.621475 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.695785 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.729251 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.745899 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.811389 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.846356 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.858936 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 03 23:45:38 crc kubenswrapper[4764]: I1203 23:45:38.913870 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.014648 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.018227 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.163944 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.168660 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.169817 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.203403 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.285045 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.330678 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.338660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.423789 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.434253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.449130 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.469004 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.541022 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.732982 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.928135 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.967769 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 03 23:45:39 crc kubenswrapper[4764]: I1203 23:45:39.994952 4764 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.001634 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.039174 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.039341 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"]
Dec 03 23:45:40 crc kubenswrapper[4764]: E1203 23:45:40.039678 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" containerName="installer"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.039731 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" containerName="installer"
Dec 03 23:45:40 crc kubenswrapper[4764]: E1203 23:45:40.039748 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" containerName="oauth-openshift"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.039759 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" containerName="oauth-openshift"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.039898 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c57be9-2500-4944-abed-6fe2e4e2dd0d" containerName="oauth-openshift"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.039926 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb393b2-6803-4b90-8f7f-a1569931b27a" containerName="installer"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.040493 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.043438 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6476cb8788-kxsk9"]
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.043749 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.043775 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.044275 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.046320 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.050098 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.050278 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.050441 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.050800 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.050931 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.051134 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.051239 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.051240 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.051341 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.051366 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.052299 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.063343 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6476cb8788-kxsk9"]
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.063838 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.070835 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.071471 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"]
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.086952 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.130040 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.176223 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.179638 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.186230 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-login\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.186393 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.186517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlh29\" (UniqueName: \"kubernetes.io/projected/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-kube-api-access-vlh29\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.186644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.186789 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-session\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.186912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfqd7\" (UniqueName: \"kubernetes.io/projected/b014c95a-3376-4f15-81e3-28fcb2ec59fa-kube-api-access-lfqd7\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-secret-volume\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-error\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-audit-policies\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187481 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b014c95a-3376-4f15-81e3-28fcb2ec59fa-audit-dir\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.187896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-config-volume\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.188010 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-router-certs\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.188132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.188254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-service-ca\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.225630 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.289592 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b014c95a-3376-4f15-81e3-28fcb2ec59fa-audit-dir\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b014c95a-3376-4f15-81e3-28fcb2ec59fa-audit-dir\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-config-volume\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-router-certs\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290393 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-service-ca\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-login\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290583 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290679 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlh29\" (UniqueName: \"kubernetes.io/projected/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-kube-api-access-vlh29\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290898 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.290984 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-session\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.291038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfqd7\" (UniqueName: \"kubernetes.io/projected/b014c95a-3376-4f15-81e3-28fcb2ec59fa-kube-api-access-lfqd7\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.291097 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-error\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.291147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-secret-volume\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.291196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-audit-policies\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.291294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.291386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.292180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-service-ca\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.292458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-config-volume\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.294815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.295051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.295219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b014c95a-3376-4f15-81e3-28fcb2ec59fa-audit-policies\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.298480 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.299123 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-router-certs\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.303363 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-session\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.303553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-error\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.303694 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.305226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.309441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-secret-volume\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.314760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.316372 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b014c95a-3376-4f15-81e3-28fcb2ec59fa-v4-0-config-user-template-login\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.327679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfqd7\" (UniqueName: \"kubernetes.io/projected/b014c95a-3376-4f15-81e3-28fcb2ec59fa-kube-api-access-lfqd7\") pod \"oauth-openshift-6476cb8788-kxsk9\" (UID: \"b014c95a-3376-4f15-81e3-28fcb2ec59fa\") " pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.328401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlh29\" (UniqueName: \"kubernetes.io/projected/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-kube-api-access-vlh29\") pod \"collect-profiles-29413425-n69h2\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.373311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.385263 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.528778 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.542857 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.575010 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"]
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.603969 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.625220 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6476cb8788-kxsk9"]
Dec 03 23:45:40 crc kubenswrapper[4764]: W1203 23:45:40.627646 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb014c95a_3376_4f15_81e3_28fcb2ec59fa.slice/crio-f433c95a3cf03179ecdee37f45196c4eb486566553074b1083fdc1c4f81c0429 WatchSource:0}: Error finding container f433c95a3cf03179ecdee37f45196c4eb486566553074b1083fdc1c4f81c0429: Status 404 returned error can't find the container with id f433c95a3cf03179ecdee37f45196c4eb486566553074b1083fdc1c4f81c0429
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.694265 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.723933 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.855337 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.884049 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.918149 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.918408 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.978516 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.989933 4764 generic.go:334] "Generic (PLEG): container finished" podID="1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" containerID="4084c667c523fa69892fc1a0a37c04c4401ec8a5b7b12e4ab4bd0347c10ceb2b" exitCode=0
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.990048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2" event={"ID":"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca","Type":"ContainerDied","Data":"4084c667c523fa69892fc1a0a37c04c4401ec8a5b7b12e4ab4bd0347c10ceb2b"}
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.990097 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2" event={"ID":"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca","Type":"ContainerStarted","Data":"119dc74af89112b864eb7f3580d9f72a33406331a7569608be10b40fbafab312"}
Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.993024 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9" event={"ID":"b014c95a-3376-4f15-81e3-28fcb2ec59fa","Type":"ContainerStarted","Data":"d4413bed16d235710f4febdaad15252c4909db6707d2da8787af625a51c2b2a6"} Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.993216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9" event={"ID":"b014c95a-3376-4f15-81e3-28fcb2ec59fa","Type":"ContainerStarted","Data":"f433c95a3cf03179ecdee37f45196c4eb486566553074b1083fdc1c4f81c0429"} Dec 03 23:45:40 crc kubenswrapper[4764]: I1203 23:45:40.993582 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.012003 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.027490 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.035935 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9" podStartSLOduration=60.03590456 podStartE2EDuration="1m0.03590456s" podCreationTimestamp="2025-12-03 23:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:45:41.034541264 +0000 UTC m=+276.795865675" watchObservedRunningTime="2025-12-03 23:45:41.03590456 +0000 UTC m=+276.797229011" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.065575 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.134019 4764 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.160089 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.178615 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.299028 4764 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.299550 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783" gracePeriod=5 Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.302790 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6476cb8788-kxsk9" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.378432 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.490189 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.530463 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.556835 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 
23:45:41.795893 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.830785 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.883524 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 23:45:41 crc kubenswrapper[4764]: I1203 23:45:41.911397 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.040100 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.131209 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.234280 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.255935 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.365651 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.396280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.452924 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.520609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlh29\" (UniqueName: \"kubernetes.io/projected/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-kube-api-access-vlh29\") pod \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.520726 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-secret-volume\") pod \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.520756 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-config-volume\") pod \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\" (UID: \"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca\") " Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.521816 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-config-volume" (OuterVolumeSpecName: "config-volume") pod "1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" (UID: "1e04e7e7-c3a8-45bb-834b-b35c6e74bcca"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.526761 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" (UID: "1e04e7e7-c3a8-45bb-834b-b35c6e74bcca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.526793 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-kube-api-access-vlh29" (OuterVolumeSpecName: "kube-api-access-vlh29") pod "1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" (UID: "1e04e7e7-c3a8-45bb-834b-b35c6e74bcca"). InnerVolumeSpecName "kube-api-access-vlh29". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.573990 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.622195 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.622271 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 23:45:42.622300 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlh29\" (UniqueName: \"kubernetes.io/projected/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca-kube-api-access-vlh29\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:42 crc kubenswrapper[4764]: I1203 
23:45:42.699808 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.003381 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.008323 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.008347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2" event={"ID":"1e04e7e7-c3a8-45bb-834b-b35c6e74bcca","Type":"ContainerDied","Data":"119dc74af89112b864eb7f3580d9f72a33406331a7569608be10b40fbafab312"} Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.008417 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="119dc74af89112b864eb7f3580d9f72a33406331a7569608be10b40fbafab312" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.037660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.053585 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.121182 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.262178 4764 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.401257 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 
23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.465303 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.651313 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.660385 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.788889 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.809413 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 23:45:43 crc kubenswrapper[4764]: I1203 23:45:43.837329 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 23:45:44 crc kubenswrapper[4764]: I1203 23:45:44.048840 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 23:45:44 crc kubenswrapper[4764]: I1203 23:45:44.195752 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 23:45:44 crc kubenswrapper[4764]: I1203 23:45:44.567055 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 23:45:44 crc kubenswrapper[4764]: I1203 23:45:44.682296 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 23:45:44 crc kubenswrapper[4764]: I1203 23:45:44.708831 4764 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 23:45:45 crc kubenswrapper[4764]: I1203 23:45:45.116388 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 23:45:45 crc kubenswrapper[4764]: I1203 23:45:45.501920 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 23:45:46 crc kubenswrapper[4764]: I1203 23:45:46.888015 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 23:45:46 crc kubenswrapper[4764]: I1203 23:45:46.888124 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.001086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.001218 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.001213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002131 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002242 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002325 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002374 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002429 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002560 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.002959 4764 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.003012 4764 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.003191 4764 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.003221 4764 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.010487 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.039652 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.039781 4764 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783" exitCode=137 Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.039864 4764 scope.go:117] "RemoveContainer" containerID="c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.039867 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.066478 4764 scope.go:117] "RemoveContainer" containerID="c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783" Dec 03 23:45:47 crc kubenswrapper[4764]: E1203 23:45:47.067147 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783\": container with ID starting with c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783 not found: ID does not exist" containerID="c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.067206 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783"} err="failed to get container status \"c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783\": rpc error: code = NotFound desc = could not find container 
\"c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783\": container with ID starting with c5016a42bed8fd85eb0cc387baf1ea8267db4fe9f6071f90cf6205ebe954a783 not found: ID does not exist" Dec 03 23:45:47 crc kubenswrapper[4764]: I1203 23:45:47.104520 4764 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 23:45:48 crc kubenswrapper[4764]: I1203 23:45:48.558918 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 23:45:48 crc kubenswrapper[4764]: I1203 23:45:48.559677 4764 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 23:45:48 crc kubenswrapper[4764]: I1203 23:45:48.574743 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 23:45:48 crc kubenswrapper[4764]: I1203 23:45:48.574803 4764 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5740871e-7f71-4ff1-9a4e-9301eb776cb7" Dec 03 23:45:48 crc kubenswrapper[4764]: I1203 23:45:48.580839 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 23:45:48 crc kubenswrapper[4764]: I1203 23:45:48.580894 4764 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="5740871e-7f71-4ff1-9a4e-9301eb776cb7" Dec 03 23:45:59 crc kubenswrapper[4764]: I1203 23:45:59.115884 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9415d09-8034-4627-80dc-ae731d9f466e" 
containerID="96b52f286694278c724a4e1484a8a84187fcc67c763f8c345700208209105ba3" exitCode=0 Dec 03 23:45:59 crc kubenswrapper[4764]: I1203 23:45:59.115975 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" event={"ID":"c9415d09-8034-4627-80dc-ae731d9f466e","Type":"ContainerDied","Data":"96b52f286694278c724a4e1484a8a84187fcc67c763f8c345700208209105ba3"} Dec 03 23:45:59 crc kubenswrapper[4764]: I1203 23:45:59.117517 4764 scope.go:117] "RemoveContainer" containerID="96b52f286694278c724a4e1484a8a84187fcc67c763f8c345700208209105ba3" Dec 03 23:46:00 crc kubenswrapper[4764]: I1203 23:46:00.122574 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" event={"ID":"c9415d09-8034-4627-80dc-ae731d9f466e","Type":"ContainerStarted","Data":"687146a8cef817d1fd9d607dd124b2fe1cab41f26deb46632c372164215cf28f"} Dec 03 23:46:00 crc kubenswrapper[4764]: I1203 23:46:00.123166 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:46:00 crc kubenswrapper[4764]: I1203 23:46:00.124013 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:46:02 crc kubenswrapper[4764]: I1203 23:46:02.200488 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 23:46:03 crc kubenswrapper[4764]: I1203 23:46:03.619703 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 23:46:03 crc kubenswrapper[4764]: I1203 23:46:03.667085 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn7zt"] Dec 03 23:46:03 crc kubenswrapper[4764]: I1203 23:46:03.667393 4764 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" podUID="a44e8e46-19c5-4242-8186-12ec04167e59" containerName="controller-manager" containerID="cri-o://0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8" gracePeriod=30 Dec 03 23:46:03 crc kubenswrapper[4764]: I1203 23:46:03.774161 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6"] Dec 03 23:46:03 crc kubenswrapper[4764]: I1203 23:46:03.774388 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" podUID="6590d259-1d9c-41e2-b070-5e7a1fa53d34" containerName="route-controller-manager" containerID="cri-o://24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5" gracePeriod=30 Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.077606 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.138088 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-client-ca\") pod \"a44e8e46-19c5-4242-8186-12ec04167e59\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.138151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-proxy-ca-bundles\") pod \"a44e8e46-19c5-4242-8186-12ec04167e59\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.138185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-config\") pod \"a44e8e46-19c5-4242-8186-12ec04167e59\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.138257 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e8e46-19c5-4242-8186-12ec04167e59-serving-cert\") pod \"a44e8e46-19c5-4242-8186-12ec04167e59\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.138287 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qq8j\" (UniqueName: \"kubernetes.io/projected/a44e8e46-19c5-4242-8186-12ec04167e59-kube-api-access-6qq8j\") pod \"a44e8e46-19c5-4242-8186-12ec04167e59\" (UID: \"a44e8e46-19c5-4242-8186-12ec04167e59\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.139407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-client-ca" (OuterVolumeSpecName: "client-ca") pod "a44e8e46-19c5-4242-8186-12ec04167e59" (UID: "a44e8e46-19c5-4242-8186-12ec04167e59"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.139434 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-config" (OuterVolumeSpecName: "config") pod "a44e8e46-19c5-4242-8186-12ec04167e59" (UID: "a44e8e46-19c5-4242-8186-12ec04167e59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.139848 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a44e8e46-19c5-4242-8186-12ec04167e59" (UID: "a44e8e46-19c5-4242-8186-12ec04167e59"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.145066 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44e8e46-19c5-4242-8186-12ec04167e59-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a44e8e46-19c5-4242-8186-12ec04167e59" (UID: "a44e8e46-19c5-4242-8186-12ec04167e59"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.146333 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.149062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44e8e46-19c5-4242-8186-12ec04167e59-kube-api-access-6qq8j" (OuterVolumeSpecName: "kube-api-access-6qq8j") pod "a44e8e46-19c5-4242-8186-12ec04167e59" (UID: "a44e8e46-19c5-4242-8186-12ec04167e59"). InnerVolumeSpecName "kube-api-access-6qq8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.149124 4764 generic.go:334] "Generic (PLEG): container finished" podID="6590d259-1d9c-41e2-b070-5e7a1fa53d34" containerID="24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5" exitCode=0 Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.149195 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" event={"ID":"6590d259-1d9c-41e2-b070-5e7a1fa53d34","Type":"ContainerDied","Data":"24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5"} Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.149224 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" event={"ID":"6590d259-1d9c-41e2-b070-5e7a1fa53d34","Type":"ContainerDied","Data":"3e7a95eb0c7c2d3d30c41e2fcdbecb1d7689d7cc86259105d2329bbc5d42d26d"} Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.149245 4764 scope.go:117] "RemoveContainer" containerID="24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.149251 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.152695 4764 generic.go:334] "Generic (PLEG): container finished" podID="a44e8e46-19c5-4242-8186-12ec04167e59" containerID="0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8" exitCode=0 Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.152748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" event={"ID":"a44e8e46-19c5-4242-8186-12ec04167e59","Type":"ContainerDied","Data":"0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8"} Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.152770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" event={"ID":"a44e8e46-19c5-4242-8186-12ec04167e59","Type":"ContainerDied","Data":"869d1daf7ebda525ee48ddfef0b28e75b085251bf882459bf88d630e49ac283f"} Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.152821 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bn7zt" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.189571 4764 scope.go:117] "RemoveContainer" containerID="24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.189956 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn7zt"] Dec 03 23:46:04 crc kubenswrapper[4764]: E1203 23:46:04.190120 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5\": container with ID starting with 24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5 not found: ID does not exist" containerID="24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.190177 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5"} err="failed to get container status \"24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5\": rpc error: code = NotFound desc = could not find container \"24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5\": container with ID starting with 24187d4235f6ed4130cb53a565fc82e9caa826e6a97c57a13a3648ff051212d5 not found: ID does not exist" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.190209 4764 scope.go:117] "RemoveContainer" containerID="0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.194483 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bn7zt"] Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.205393 4764 scope.go:117] "RemoveContainer" 
containerID="0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8" Dec 03 23:46:04 crc kubenswrapper[4764]: E1203 23:46:04.206049 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8\": container with ID starting with 0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8 not found: ID does not exist" containerID="0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.206088 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8"} err="failed to get container status \"0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8\": rpc error: code = NotFound desc = could not find container \"0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8\": container with ID starting with 0879f79738dbec5796fe3aa68a3a3110c14f0f1aeabb62325dd374818907b2d8 not found: ID does not exist" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjz4g\" (UniqueName: \"kubernetes.io/projected/6590d259-1d9c-41e2-b070-5e7a1fa53d34-kube-api-access-gjz4g\") pod \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-config\") pod \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239586 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6590d259-1d9c-41e2-b070-5e7a1fa53d34-serving-cert\") pod \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-client-ca\") pod \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\" (UID: \"6590d259-1d9c-41e2-b070-5e7a1fa53d34\") " Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qq8j\" (UniqueName: \"kubernetes.io/projected/a44e8e46-19c5-4242-8186-12ec04167e59-kube-api-access-6qq8j\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239850 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239862 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239874 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44e8e46-19c5-4242-8186-12ec04167e59-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.239885 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44e8e46-19c5-4242-8186-12ec04167e59-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.240499 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-client-ca" (OuterVolumeSpecName: "client-ca") pod "6590d259-1d9c-41e2-b070-5e7a1fa53d34" (UID: "6590d259-1d9c-41e2-b070-5e7a1fa53d34"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.240562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-config" (OuterVolumeSpecName: "config") pod "6590d259-1d9c-41e2-b070-5e7a1fa53d34" (UID: "6590d259-1d9c-41e2-b070-5e7a1fa53d34"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.242472 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6590d259-1d9c-41e2-b070-5e7a1fa53d34-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6590d259-1d9c-41e2-b070-5e7a1fa53d34" (UID: "6590d259-1d9c-41e2-b070-5e7a1fa53d34"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.243619 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6590d259-1d9c-41e2-b070-5e7a1fa53d34-kube-api-access-gjz4g" (OuterVolumeSpecName: "kube-api-access-gjz4g") pod "6590d259-1d9c-41e2-b070-5e7a1fa53d34" (UID: "6590d259-1d9c-41e2-b070-5e7a1fa53d34"). InnerVolumeSpecName "kube-api-access-gjz4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.341349 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjz4g\" (UniqueName: \"kubernetes.io/projected/6590d259-1d9c-41e2-b070-5e7a1fa53d34-kube-api-access-gjz4g\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.341383 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.341395 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6590d259-1d9c-41e2-b070-5e7a1fa53d34-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.341405 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6590d259-1d9c-41e2-b070-5e7a1fa53d34-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.396526 4764 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.506448 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6"] Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.513610 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-2sfl6"] Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.553081 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6590d259-1d9c-41e2-b070-5e7a1fa53d34" path="/var/lib/kubelet/pods/6590d259-1d9c-41e2-b070-5e7a1fa53d34/volumes" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 
23:46:04.553835 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44e8e46-19c5-4242-8186-12ec04167e59" path="/var/lib/kubelet/pods/a44e8e46-19c5-4242-8186-12ec04167e59/volumes" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.907389 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"] Dec 03 23:46:04 crc kubenswrapper[4764]: E1203 23:46:04.908886 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6590d259-1d9c-41e2-b070-5e7a1fa53d34" containerName="route-controller-manager" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.908988 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6590d259-1d9c-41e2-b070-5e7a1fa53d34" containerName="route-controller-manager" Dec 03 23:46:04 crc kubenswrapper[4764]: E1203 23:46:04.909071 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.909186 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 23:46:04 crc kubenswrapper[4764]: E1203 23:46:04.909284 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44e8e46-19c5-4242-8186-12ec04167e59" containerName="controller-manager" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.909370 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44e8e46-19c5-4242-8186-12ec04167e59" containerName="controller-manager" Dec 03 23:46:04 crc kubenswrapper[4764]: E1203 23:46:04.909463 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" containerName="collect-profiles" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.909569 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" 
containerName="collect-profiles" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.909797 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.909898 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44e8e46-19c5-4242-8186-12ec04167e59" containerName="controller-manager" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.909996 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" containerName="collect-profiles" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.910096 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6590d259-1d9c-41e2-b070-5e7a1fa53d34" containerName="route-controller-manager" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.910556 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"] Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.910652 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.911318 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.912862 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.913064 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.913394 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.914270 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.915213 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.918645 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.918669 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.918852 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.918969 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.919006 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 
23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.919112 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.919196 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.924820 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"] Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.925277 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.928737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"] Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-proxy-ca-bundles\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62tjx\" (UniqueName: \"kubernetes.io/projected/301a2b3b-5048-41f0-8761-0d919567a8c7-kube-api-access-62tjx\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-config\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-client-ca\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950619 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fba4428-e713-44b0-bd41-0c0995578e56-serving-cert\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-client-ca\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-config\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " 
pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950773 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-999jb\" (UniqueName: \"kubernetes.io/projected/9fba4428-e713-44b0-bd41-0c0995578e56-kube-api-access-999jb\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:04 crc kubenswrapper[4764]: I1203 23:46:04.950844 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/301a2b3b-5048-41f0-8761-0d919567a8c7-serving-cert\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.051945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-999jb\" (UniqueName: \"kubernetes.io/projected/9fba4428-e713-44b0-bd41-0c0995578e56-kube-api-access-999jb\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/301a2b3b-5048-41f0-8761-0d919567a8c7-serving-cert\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052133 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-proxy-ca-bundles\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62tjx\" (UniqueName: \"kubernetes.io/projected/301a2b3b-5048-41f0-8761-0d919567a8c7-kube-api-access-62tjx\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052205 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-config\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052236 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-client-ca\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fba4428-e713-44b0-bd41-0c0995578e56-serving-cert\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 
03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-client-ca\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.052373 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-config\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.053497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-client-ca\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.053613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-proxy-ca-bundles\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.054185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-config\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " 
pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.054958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-client-ca\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.056074 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-config\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.057613 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fba4428-e713-44b0-bd41-0c0995578e56-serving-cert\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.061229 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/301a2b3b-5048-41f0-8761-0d919567a8c7-serving-cert\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.080829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62tjx\" (UniqueName: \"kubernetes.io/projected/301a2b3b-5048-41f0-8761-0d919567a8c7-kube-api-access-62tjx\") pod \"route-controller-manager-7b8845b8cc-26pgr\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") " pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.082576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-999jb\" (UniqueName: \"kubernetes.io/projected/9fba4428-e713-44b0-bd41-0c0995578e56-kube-api-access-999jb\") pod \"controller-manager-587dcdb4c5-fkl7r\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") " pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.240146 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.245161 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.539615 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"]
Dec 03 23:46:05 crc kubenswrapper[4764]: W1203 23:46:05.548804 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301a2b3b_5048_41f0_8761_0d919567a8c7.slice/crio-8774a4e1ea12a831ca4dfb4513c3499d24c91dd4b24a51d2053dbb98d9f987fe WatchSource:0}: Error finding container 8774a4e1ea12a831ca4dfb4513c3499d24c91dd4b24a51d2053dbb98d9f987fe: Status 404 returned error can't find the container with id 8774a4e1ea12a831ca4dfb4513c3499d24c91dd4b24a51d2053dbb98d9f987fe
Dec 03 23:46:05 crc kubenswrapper[4764]: I1203 23:46:05.798650 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"]
Dec 03 23:46:05 crc kubenswrapper[4764]: W1203 23:46:05.806399 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fba4428_e713_44b0_bd41_0c0995578e56.slice/crio-56405b0fba3124e466295bda948a48da0947ff5a25e4e8e3c4b38f18a374d1eb WatchSource:0}: Error finding container 56405b0fba3124e466295bda948a48da0947ff5a25e4e8e3c4b38f18a374d1eb: Status 404 returned error can't find the container with id 56405b0fba3124e466295bda948a48da0947ff5a25e4e8e3c4b38f18a374d1eb
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.166501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" event={"ID":"9fba4428-e713-44b0-bd41-0c0995578e56","Type":"ContainerStarted","Data":"9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b"}
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.166910 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.166936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" event={"ID":"9fba4428-e713-44b0-bd41-0c0995578e56","Type":"ContainerStarted","Data":"56405b0fba3124e466295bda948a48da0947ff5a25e4e8e3c4b38f18a374d1eb"}
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.168611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" event={"ID":"301a2b3b-5048-41f0-8761-0d919567a8c7","Type":"ContainerStarted","Data":"84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3"}
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.168653 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" event={"ID":"301a2b3b-5048-41f0-8761-0d919567a8c7","Type":"ContainerStarted","Data":"8774a4e1ea12a831ca4dfb4513c3499d24c91dd4b24a51d2053dbb98d9f987fe"}
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.168839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.172282 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.174270 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.186920 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" podStartSLOduration=3.186894993 podStartE2EDuration="3.186894993s" podCreationTimestamp="2025-12-03 23:46:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:06.184100228 +0000 UTC m=+301.945424649" watchObservedRunningTime="2025-12-03 23:46:06.186894993 +0000 UTC m=+301.948219414"
Dec 03 23:46:06 crc kubenswrapper[4764]: I1203 23:46:06.208574 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" podStartSLOduration=3.20855605 podStartE2EDuration="3.20855605s" podCreationTimestamp="2025-12-03 23:46:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:06.206633709 +0000 UTC m=+301.967958130" watchObservedRunningTime="2025-12-03 23:46:06.20855605 +0000 UTC m=+301.969880481"
Dec 03 23:46:07 crc kubenswrapper[4764]: I1203 23:46:07.337922 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"]
Dec 03 23:46:07 crc kubenswrapper[4764]: I1203 23:46:07.359981 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"]
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.183845 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" podUID="301a2b3b-5048-41f0-8761-0d919567a8c7" containerName="route-controller-manager" containerID="cri-o://84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3" gracePeriod=30
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.183906 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" podUID="9fba4428-e713-44b0-bd41-0c0995578e56" containerName="controller-manager" containerID="cri-o://9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b" gracePeriod=30
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.710249 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.752085 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"]
Dec 03 23:46:09 crc kubenswrapper[4764]: E1203 23:46:09.752451 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="301a2b3b-5048-41f0-8761-0d919567a8c7" containerName="route-controller-manager"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.752480 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="301a2b3b-5048-41f0-8761-0d919567a8c7" containerName="route-controller-manager"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.752664 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="301a2b3b-5048-41f0-8761-0d919567a8c7" containerName="route-controller-manager"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.753331 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.760987 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"]
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.769015 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-proxy-ca-bundles\") pod \"9fba4428-e713-44b0-bd41-0c0995578e56\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822673 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/301a2b3b-5048-41f0-8761-0d919567a8c7-serving-cert\") pod \"301a2b3b-5048-41f0-8761-0d919567a8c7\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822705 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-config\") pod \"9fba4428-e713-44b0-bd41-0c0995578e56\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-999jb\" (UniqueName: \"kubernetes.io/projected/9fba4428-e713-44b0-bd41-0c0995578e56-kube-api-access-999jb\") pod \"9fba4428-e713-44b0-bd41-0c0995578e56\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-client-ca\") pod \"301a2b3b-5048-41f0-8761-0d919567a8c7\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822868 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-client-ca\") pod \"9fba4428-e713-44b0-bd41-0c0995578e56\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822907 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fba4428-e713-44b0-bd41-0c0995578e56-serving-cert\") pod \"9fba4428-e713-44b0-bd41-0c0995578e56\" (UID: \"9fba4428-e713-44b0-bd41-0c0995578e56\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822928 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62tjx\" (UniqueName: \"kubernetes.io/projected/301a2b3b-5048-41f0-8761-0d919567a8c7-kube-api-access-62tjx\") pod \"301a2b3b-5048-41f0-8761-0d919567a8c7\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.822963 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-config\") pod \"301a2b3b-5048-41f0-8761-0d919567a8c7\" (UID: \"301a2b3b-5048-41f0-8761-0d919567a8c7\") "
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.823155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a56965c-900a-4ad7-96a8-c62802e2251c-serving-cert\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.823185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-config\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.823246 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdrh\" (UniqueName: \"kubernetes.io/projected/1a56965c-900a-4ad7-96a8-c62802e2251c-kube-api-access-zmdrh\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.823268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-client-ca\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.824407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-config" (OuterVolumeSpecName: "config") pod "301a2b3b-5048-41f0-8761-0d919567a8c7" (UID: "301a2b3b-5048-41f0-8761-0d919567a8c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.824865 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9fba4428-e713-44b0-bd41-0c0995578e56" (UID: "9fba4428-e713-44b0-bd41-0c0995578e56"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.824891 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-client-ca" (OuterVolumeSpecName: "client-ca") pod "9fba4428-e713-44b0-bd41-0c0995578e56" (UID: "9fba4428-e713-44b0-bd41-0c0995578e56"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.825074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-config" (OuterVolumeSpecName: "config") pod "9fba4428-e713-44b0-bd41-0c0995578e56" (UID: "9fba4428-e713-44b0-bd41-0c0995578e56"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.829585 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "301a2b3b-5048-41f0-8761-0d919567a8c7" (UID: "301a2b3b-5048-41f0-8761-0d919567a8c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.829814 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/301a2b3b-5048-41f0-8761-0d919567a8c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "301a2b3b-5048-41f0-8761-0d919567a8c7" (UID: "301a2b3b-5048-41f0-8761-0d919567a8c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.829935 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fba4428-e713-44b0-bd41-0c0995578e56-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9fba4428-e713-44b0-bd41-0c0995578e56" (UID: "9fba4428-e713-44b0-bd41-0c0995578e56"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.830526 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fba4428-e713-44b0-bd41-0c0995578e56-kube-api-access-999jb" (OuterVolumeSpecName: "kube-api-access-999jb") pod "9fba4428-e713-44b0-bd41-0c0995578e56" (UID: "9fba4428-e713-44b0-bd41-0c0995578e56"). InnerVolumeSpecName "kube-api-access-999jb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.831173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301a2b3b-5048-41f0-8761-0d919567a8c7-kube-api-access-62tjx" (OuterVolumeSpecName: "kube-api-access-62tjx") pod "301a2b3b-5048-41f0-8761-0d919567a8c7" (UID: "301a2b3b-5048-41f0-8761-0d919567a8c7"). InnerVolumeSpecName "kube-api-access-62tjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.925371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdrh\" (UniqueName: \"kubernetes.io/projected/1a56965c-900a-4ad7-96a8-c62802e2251c-kube-api-access-zmdrh\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.925686 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-client-ca\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.926929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-client-ca\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a56965c-900a-4ad7-96a8-c62802e2251c-serving-cert\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-config\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927669 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-config\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927685 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927700 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/301a2b3b-5048-41f0-8761-0d919567a8c7-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927828 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-config\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927842 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-999jb\" (UniqueName: \"kubernetes.io/projected/9fba4428-e713-44b0-bd41-0c0995578e56-kube-api-access-999jb\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927855 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/301a2b3b-5048-41f0-8761-0d919567a8c7-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927867 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9fba4428-e713-44b0-bd41-0c0995578e56-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927879 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fba4428-e713-44b0-bd41-0c0995578e56-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.927891 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62tjx\" (UniqueName: \"kubernetes.io/projected/301a2b3b-5048-41f0-8761-0d919567a8c7-kube-api-access-62tjx\") on node \"crc\" DevicePath \"\""
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.928510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-config\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.934249 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a56965c-900a-4ad7-96a8-c62802e2251c-serving-cert\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:09 crc kubenswrapper[4764]: I1203 23:46:09.944678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdrh\" (UniqueName: \"kubernetes.io/projected/1a56965c-900a-4ad7-96a8-c62802e2251c-kube-api-access-zmdrh\") pod \"route-controller-manager-77b5c7dfdc-hjf7p\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.087420 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.191090 4764 generic.go:334] "Generic (PLEG): container finished" podID="9fba4428-e713-44b0-bd41-0c0995578e56" containerID="9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b" exitCode=0
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.191173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" event={"ID":"9fba4428-e713-44b0-bd41-0c0995578e56","Type":"ContainerDied","Data":"9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b"}
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.191184 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.191215 4764 scope.go:117] "RemoveContainer" containerID="9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.191203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r" event={"ID":"9fba4428-e713-44b0-bd41-0c0995578e56","Type":"ContainerDied","Data":"56405b0fba3124e466295bda948a48da0947ff5a25e4e8e3c4b38f18a374d1eb"}
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.193629 4764 generic.go:334] "Generic (PLEG): container finished" podID="301a2b3b-5048-41f0-8761-0d919567a8c7" containerID="84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3" exitCode=0
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.193659 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" event={"ID":"301a2b3b-5048-41f0-8761-0d919567a8c7","Type":"ContainerDied","Data":"84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3"}
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.193675 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr" event={"ID":"301a2b3b-5048-41f0-8761-0d919567a8c7","Type":"ContainerDied","Data":"8774a4e1ea12a831ca4dfb4513c3499d24c91dd4b24a51d2053dbb98d9f987fe"}
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.193738 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.213793 4764 scope.go:117] "RemoveContainer" containerID="9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b"
Dec 03 23:46:10 crc kubenswrapper[4764]: E1203 23:46:10.214380 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b\": container with ID starting with 9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b not found: ID does not exist" containerID="9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.214412 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b"} err="failed to get container status \"9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b\": rpc error: code = NotFound desc = could not find container \"9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b\": container with ID starting with 9d8447dbaee17aae5893c3bbc8b9a6b01198781f658d58dca510f88e56a5696b not found: ID does not exist"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.214434 4764 scope.go:117] "RemoveContainer" containerID="84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.229380 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"]
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.233533 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-587dcdb4c5-fkl7r"]
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.242622 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"]
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.245471 4764 scope.go:117] "RemoveContainer" containerID="84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3"
Dec 03 23:46:10 crc kubenswrapper[4764]: E1203 23:46:10.245868 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3\": container with ID starting with 84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3 not found: ID does not exist" containerID="84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.245912 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3"} err="failed to get container status \"84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3\": rpc error: code = NotFound desc = could not find container \"84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3\": container with ID starting with 84e1c383dba69f3fe33b28971709dfab0d95d9251f1676e25d56ee27db46a3c3 not found: ID does not exist"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.246666 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b8845b8cc-26pgr"]
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.269253 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"]
Dec 03 23:46:10 crc kubenswrapper[4764]: W1203 23:46:10.272067 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a56965c_900a_4ad7_96a8_c62802e2251c.slice/crio-4e5f9f37876d59a17fdfbc5b9f422bed080bf636d7a59409730e304e50976add WatchSource:0}: Error finding container 4e5f9f37876d59a17fdfbc5b9f422bed080bf636d7a59409730e304e50976add: Status 404 returned error can't find the container with id 4e5f9f37876d59a17fdfbc5b9f422bed080bf636d7a59409730e304e50976add
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.551161 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301a2b3b-5048-41f0-8761-0d919567a8c7" path="/var/lib/kubelet/pods/301a2b3b-5048-41f0-8761-0d919567a8c7/volumes"
Dec 03 23:46:10 crc kubenswrapper[4764]: I1203 23:46:10.551671 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fba4428-e713-44b0-bd41-0c0995578e56" path="/var/lib/kubelet/pods/9fba4428-e713-44b0-bd41-0c0995578e56/volumes"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.202696 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" event={"ID":"1a56965c-900a-4ad7-96a8-c62802e2251c","Type":"ContainerStarted","Data":"a70dea67c5b8c861e776c68f60c5081b12b5c97466aa605a1a6ad91f3ea683cb"}
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.202762 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" event={"ID":"1a56965c-900a-4ad7-96a8-c62802e2251c","Type":"ContainerStarted","Data":"4e5f9f37876d59a17fdfbc5b9f422bed080bf636d7a59409730e304e50976add"}
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.203947 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.210497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.256680 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" podStartSLOduration=4.2566590380000005 podStartE2EDuration="4.256659038s" podCreationTimestamp="2025-12-03 23:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:11.233917593 +0000 UTC m=+306.995242024" watchObservedRunningTime="2025-12-03 23:46:11.256659038 +0000 UTC m=+307.017983459"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.918035 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-48fvw"]
Dec 03 23:46:11 crc kubenswrapper[4764]: E1203 23:46:11.918823 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fba4428-e713-44b0-bd41-0c0995578e56" containerName="controller-manager"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.918859 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fba4428-e713-44b0-bd41-0c0995578e56" containerName="controller-manager"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.919314 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fba4428-e713-44b0-bd41-0c0995578e56" containerName="controller-manager"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.920070 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.925488 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.926280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.927610 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.928147 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.928417 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.931099 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.936116 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-48fvw"]
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.939303 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.956699 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-client-ca\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.956869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-config\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.956936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04068f8c-58e1-40c1-9d6c-1f87e2705f52-serving-cert\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.957027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzczt\" (UniqueName: \"kubernetes.io/projected/04068f8c-58e1-40c1-9d6c-1f87e2705f52-kube-api-access-pzczt\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw"
Dec 03 23:46:11 crc kubenswrapper[4764]: I1203 23:46:11.957104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-proxy-ca-bundles\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw"
Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.058151 4764
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-client-ca\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.058262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-config\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.058309 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04068f8c-58e1-40c1-9d6c-1f87e2705f52-serving-cert\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.058369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzczt\" (UniqueName: \"kubernetes.io/projected/04068f8c-58e1-40c1-9d6c-1f87e2705f52-kube-api-access-pzczt\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.059339 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-proxy-ca-bundles\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " 
pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.060200 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-config\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.060368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-client-ca\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.061416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-proxy-ca-bundles\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.068476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04068f8c-58e1-40c1-9d6c-1f87e2705f52-serving-cert\") pod \"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.082898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzczt\" (UniqueName: \"kubernetes.io/projected/04068f8c-58e1-40c1-9d6c-1f87e2705f52-kube-api-access-pzczt\") pod 
\"controller-manager-5488f76589-48fvw\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.245824 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:12 crc kubenswrapper[4764]: I1203 23:46:12.485423 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-48fvw"] Dec 03 23:46:12 crc kubenswrapper[4764]: W1203 23:46:12.496708 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04068f8c_58e1_40c1_9d6c_1f87e2705f52.slice/crio-cf3045c55f1094195d8166abec6fb435212b2150d65dc0c49887d9a295144994 WatchSource:0}: Error finding container cf3045c55f1094195d8166abec6fb435212b2150d65dc0c49887d9a295144994: Status 404 returned error can't find the container with id cf3045c55f1094195d8166abec6fb435212b2150d65dc0c49887d9a295144994 Dec 03 23:46:13 crc kubenswrapper[4764]: I1203 23:46:13.217428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" event={"ID":"04068f8c-58e1-40c1-9d6c-1f87e2705f52","Type":"ContainerStarted","Data":"c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86"} Dec 03 23:46:13 crc kubenswrapper[4764]: I1203 23:46:13.217485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" event={"ID":"04068f8c-58e1-40c1-9d6c-1f87e2705f52","Type":"ContainerStarted","Data":"cf3045c55f1094195d8166abec6fb435212b2150d65dc0c49887d9a295144994"} Dec 03 23:46:13 crc kubenswrapper[4764]: I1203 23:46:13.217742 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 
23:46:13 crc kubenswrapper[4764]: I1203 23:46:13.225990 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:13 crc kubenswrapper[4764]: I1203 23:46:13.234620 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" podStartSLOduration=6.2345949449999996 podStartE2EDuration="6.234594945s" podCreationTimestamp="2025-12-03 23:46:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:13.23178399 +0000 UTC m=+308.993108441" watchObservedRunningTime="2025-12-03 23:46:13.234594945 +0000 UTC m=+308.995919366" Dec 03 23:46:15 crc kubenswrapper[4764]: I1203 23:46:15.812611 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-48fvw"] Dec 03 23:46:15 crc kubenswrapper[4764]: I1203 23:46:15.821915 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"] Dec 03 23:46:15 crc kubenswrapper[4764]: I1203 23:46:15.822182 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" podUID="1a56965c-900a-4ad7-96a8-c62802e2251c" containerName="route-controller-manager" containerID="cri-o://a70dea67c5b8c861e776c68f60c5081b12b5c97466aa605a1a6ad91f3ea683cb" gracePeriod=30 Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.237839 4764 generic.go:334] "Generic (PLEG): container finished" podID="1a56965c-900a-4ad7-96a8-c62802e2251c" containerID="a70dea67c5b8c861e776c68f60c5081b12b5c97466aa605a1a6ad91f3ea683cb" exitCode=0 Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.237915 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" event={"ID":"1a56965c-900a-4ad7-96a8-c62802e2251c","Type":"ContainerDied","Data":"a70dea67c5b8c861e776c68f60c5081b12b5c97466aa605a1a6ad91f3ea683cb"} Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.238134 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" podUID="04068f8c-58e1-40c1-9d6c-1f87e2705f52" containerName="controller-manager" containerID="cri-o://c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86" gracePeriod=30 Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.778087 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.810878 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.837036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-config\") pod \"1a56965c-900a-4ad7-96a8-c62802e2251c\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.837136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a56965c-900a-4ad7-96a8-c62802e2251c-serving-cert\") pod \"1a56965c-900a-4ad7-96a8-c62802e2251c\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.837170 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmdrh\" (UniqueName: 
\"kubernetes.io/projected/1a56965c-900a-4ad7-96a8-c62802e2251c-kube-api-access-zmdrh\") pod \"1a56965c-900a-4ad7-96a8-c62802e2251c\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.837205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-client-ca\") pod \"1a56965c-900a-4ad7-96a8-c62802e2251c\" (UID: \"1a56965c-900a-4ad7-96a8-c62802e2251c\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.838190 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-config" (OuterVolumeSpecName: "config") pod "1a56965c-900a-4ad7-96a8-c62802e2251c" (UID: "1a56965c-900a-4ad7-96a8-c62802e2251c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.838302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-client-ca" (OuterVolumeSpecName: "client-ca") pod "1a56965c-900a-4ad7-96a8-c62802e2251c" (UID: "1a56965c-900a-4ad7-96a8-c62802e2251c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.844817 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a56965c-900a-4ad7-96a8-c62802e2251c-kube-api-access-zmdrh" (OuterVolumeSpecName: "kube-api-access-zmdrh") pod "1a56965c-900a-4ad7-96a8-c62802e2251c" (UID: "1a56965c-900a-4ad7-96a8-c62802e2251c"). InnerVolumeSpecName "kube-api-access-zmdrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.845914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a56965c-900a-4ad7-96a8-c62802e2251c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1a56965c-900a-4ad7-96a8-c62802e2251c" (UID: "1a56965c-900a-4ad7-96a8-c62802e2251c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.919946 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-75858d884b-dgd8q"] Dec 03 23:46:16 crc kubenswrapper[4764]: E1203 23:46:16.920418 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04068f8c-58e1-40c1-9d6c-1f87e2705f52" containerName="controller-manager" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.920456 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="04068f8c-58e1-40c1-9d6c-1f87e2705f52" containerName="controller-manager" Dec 03 23:46:16 crc kubenswrapper[4764]: E1203 23:46:16.920512 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a56965c-900a-4ad7-96a8-c62802e2251c" containerName="route-controller-manager" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.920526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a56965c-900a-4ad7-96a8-c62802e2251c" containerName="route-controller-manager" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.920780 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a56965c-900a-4ad7-96a8-c62802e2251c" containerName="route-controller-manager" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.920816 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="04068f8c-58e1-40c1-9d6c-1f87e2705f52" containerName="controller-manager" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.921427 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.928160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj"] Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.929344 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.934117 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75858d884b-dgd8q"] Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.938466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzczt\" (UniqueName: \"kubernetes.io/projected/04068f8c-58e1-40c1-9d6c-1f87e2705f52-kube-api-access-pzczt\") pod \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.938550 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-proxy-ca-bundles\") pod \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.938583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-config\") pod \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.938627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/04068f8c-58e1-40c1-9d6c-1f87e2705f52-serving-cert\") pod \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.938758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-client-ca\") pod \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\" (UID: \"04068f8c-58e1-40c1-9d6c-1f87e2705f52\") " Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.939083 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.939104 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a56965c-900a-4ad7-96a8-c62802e2251c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.939125 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmdrh\" (UniqueName: \"kubernetes.io/projected/1a56965c-900a-4ad7-96a8-c62802e2251c-kube-api-access-zmdrh\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.939142 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1a56965c-900a-4ad7-96a8-c62802e2251c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.939811 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "04068f8c-58e1-40c1-9d6c-1f87e2705f52" (UID: "04068f8c-58e1-40c1-9d6c-1f87e2705f52"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.939874 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-client-ca" (OuterVolumeSpecName: "client-ca") pod "04068f8c-58e1-40c1-9d6c-1f87e2705f52" (UID: "04068f8c-58e1-40c1-9d6c-1f87e2705f52"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.940117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-config" (OuterVolumeSpecName: "config") pod "04068f8c-58e1-40c1-9d6c-1f87e2705f52" (UID: "04068f8c-58e1-40c1-9d6c-1f87e2705f52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.943739 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04068f8c-58e1-40c1-9d6c-1f87e2705f52-kube-api-access-pzczt" (OuterVolumeSpecName: "kube-api-access-pzczt") pod "04068f8c-58e1-40c1-9d6c-1f87e2705f52" (UID: "04068f8c-58e1-40c1-9d6c-1f87e2705f52"). InnerVolumeSpecName "kube-api-access-pzczt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.943771 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04068f8c-58e1-40c1-9d6c-1f87e2705f52-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "04068f8c-58e1-40c1-9d6c-1f87e2705f52" (UID: "04068f8c-58e1-40c1-9d6c-1f87e2705f52"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:16 crc kubenswrapper[4764]: I1203 23:46:16.955412 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj"] Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w9j4\" (UniqueName: \"kubernetes.io/projected/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-kube-api-access-4w9j4\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040533 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-config\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-serving-cert\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040755 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-proxy-ca-bundles\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " 
pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-config\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040835 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc22h\" (UniqueName: \"kubernetes.io/projected/9791cec8-789e-41c8-8a46-96bcec6873fd-kube-api-access-xc22h\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040862 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9791cec8-789e-41c8-8a46-96bcec6873fd-serving-cert\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-client-ca\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.040955 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-client-ca\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.041051 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04068f8c-58e1-40c1-9d6c-1f87e2705f52-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.041064 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.041077 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzczt\" (UniqueName: \"kubernetes.io/projected/04068f8c-58e1-40c1-9d6c-1f87e2705f52-kube-api-access-pzczt\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.041085 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.041094 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04068f8c-58e1-40c1-9d6c-1f87e2705f52-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142443 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-client-ca\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: 
\"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w9j4\" (UniqueName: \"kubernetes.io/projected/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-kube-api-access-4w9j4\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-config\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-serving-cert\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-proxy-ca-bundles\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-config\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc22h\" (UniqueName: \"kubernetes.io/projected/9791cec8-789e-41c8-8a46-96bcec6873fd-kube-api-access-xc22h\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9791cec8-789e-41c8-8a46-96bcec6873fd-serving-cert\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.142908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-client-ca\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.144688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-client-ca\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: 
I1203 23:46:17.144696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-config\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.144737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-client-ca\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.145677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-proxy-ca-bundles\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.147303 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-config\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.148946 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-serving-cert\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " 
pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.149273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9791cec8-789e-41c8-8a46-96bcec6873fd-serving-cert\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.164983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc22h\" (UniqueName: \"kubernetes.io/projected/9791cec8-789e-41c8-8a46-96bcec6873fd-kube-api-access-xc22h\") pod \"route-controller-manager-79b6b76b5b-mv8mj\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.179093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w9j4\" (UniqueName: \"kubernetes.io/projected/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-kube-api-access-4w9j4\") pod \"controller-manager-75858d884b-dgd8q\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.244446 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.244960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p" event={"ID":"1a56965c-900a-4ad7-96a8-c62802e2251c","Type":"ContainerDied","Data":"4e5f9f37876d59a17fdfbc5b9f422bed080bf636d7a59409730e304e50976add"} Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.245147 4764 scope.go:117] "RemoveContainer" containerID="a70dea67c5b8c861e776c68f60c5081b12b5c97466aa605a1a6ad91f3ea683cb" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.247380 4764 generic.go:334] "Generic (PLEG): container finished" podID="04068f8c-58e1-40c1-9d6c-1f87e2705f52" containerID="c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86" exitCode=0 Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.247521 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" event={"ID":"04068f8c-58e1-40c1-9d6c-1f87e2705f52","Type":"ContainerDied","Data":"c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86"} Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.247638 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" event={"ID":"04068f8c-58e1-40c1-9d6c-1f87e2705f52","Type":"ContainerDied","Data":"cf3045c55f1094195d8166abec6fb435212b2150d65dc0c49887d9a295144994"} Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.247826 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5488f76589-48fvw" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.264568 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.270185 4764 scope.go:117] "RemoveContainer" containerID="c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.275930 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.282394 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"] Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.298999 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-hjf7p"] Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.316337 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-48fvw"] Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.321368 4764 scope.go:117] "RemoveContainer" containerID="c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86" Dec 03 23:46:17 crc kubenswrapper[4764]: E1203 23:46:17.321896 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86\": container with ID starting with c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86 not found: ID does not exist" containerID="c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.321967 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86"} err="failed to get container 
status \"c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86\": rpc error: code = NotFound desc = could not find container \"c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86\": container with ID starting with c975297e5207ef775bc7fbfb5a4004649c0c05bc545d39cb4b8efb51e2ea9b86 not found: ID does not exist" Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.327674 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-48fvw"] Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.514510 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-75858d884b-dgd8q"] Dec 03 23:46:17 crc kubenswrapper[4764]: W1203 23:46:17.524506 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dcd6240_fcf0_4c40_90ad_3285fcb4d46b.slice/crio-3a2fadd306bef8796aafb742fcd18ddaf4efc5ba0e62042945c9b3b688ea4e97 WatchSource:0}: Error finding container 3a2fadd306bef8796aafb742fcd18ddaf4efc5ba0e62042945c9b3b688ea4e97: Status 404 returned error can't find the container with id 3a2fadd306bef8796aafb742fcd18ddaf4efc5ba0e62042945c9b3b688ea4e97 Dec 03 23:46:17 crc kubenswrapper[4764]: I1203 23:46:17.556585 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj"] Dec 03 23:46:17 crc kubenswrapper[4764]: W1203 23:46:17.562669 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9791cec8_789e_41c8_8a46_96bcec6873fd.slice/crio-2923531dabe8ef1a8e6c7e356818999c653dad689940161f01e1510cfd0a798e WatchSource:0}: Error finding container 2923531dabe8ef1a8e6c7e356818999c653dad689940161f01e1510cfd0a798e: Status 404 returned error can't find the container with id 2923531dabe8ef1a8e6c7e356818999c653dad689940161f01e1510cfd0a798e Dec 03 
23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.263986 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" event={"ID":"9791cec8-789e-41c8-8a46-96bcec6873fd","Type":"ContainerStarted","Data":"d755b5b722545c08d80659f97b9f6bdf5361267f022c397182aa39650ec0005a"} Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.264384 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.264399 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" event={"ID":"9791cec8-789e-41c8-8a46-96bcec6873fd","Type":"ContainerStarted","Data":"2923531dabe8ef1a8e6c7e356818999c653dad689940161f01e1510cfd0a798e"} Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.266583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" event={"ID":"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b","Type":"ContainerStarted","Data":"ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542"} Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.266611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" event={"ID":"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b","Type":"ContainerStarted","Data":"3a2fadd306bef8796aafb742fcd18ddaf4efc5ba0e62042945c9b3b688ea4e97"} Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.267140 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.273418 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.273809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.286734 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" podStartSLOduration=3.286694669 podStartE2EDuration="3.286694669s" podCreationTimestamp="2025-12-03 23:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:18.283812562 +0000 UTC m=+314.045136983" watchObservedRunningTime="2025-12-03 23:46:18.286694669 +0000 UTC m=+314.048019090" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.338447 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" podStartSLOduration=3.338427237 podStartE2EDuration="3.338427237s" podCreationTimestamp="2025-12-03 23:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:18.307130713 +0000 UTC m=+314.068455124" watchObservedRunningTime="2025-12-03 23:46:18.338427237 +0000 UTC m=+314.099751658" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.436510 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 23:46:18.556935 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04068f8c-58e1-40c1-9d6c-1f87e2705f52" path="/var/lib/kubelet/pods/04068f8c-58e1-40c1-9d6c-1f87e2705f52/volumes" Dec 03 23:46:18 crc kubenswrapper[4764]: I1203 
23:46:18.558129 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a56965c-900a-4ad7-96a8-c62802e2251c" path="/var/lib/kubelet/pods/1a56965c-900a-4ad7-96a8-c62802e2251c/volumes" Dec 03 23:46:23 crc kubenswrapper[4764]: I1203 23:46:23.675408 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75858d884b-dgd8q"] Dec 03 23:46:23 crc kubenswrapper[4764]: I1203 23:46:23.676138 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" podUID="0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" containerName="controller-manager" containerID="cri-o://ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542" gracePeriod=30 Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.180194 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.262463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-serving-cert\") pod \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.262535 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-config\") pod \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.262567 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-client-ca\") pod \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\" (UID: 
\"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.263560 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-config" (OuterVolumeSpecName: "config") pod "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" (UID: "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.263587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-client-ca" (OuterVolumeSpecName: "client-ca") pod "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" (UID: "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.263633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-proxy-ca-bundles\") pod \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.264233 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" (UID: "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.264318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w9j4\" (UniqueName: \"kubernetes.io/projected/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-kube-api-access-4w9j4\") pod \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\" (UID: \"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b\") " Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.264909 4764 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.264931 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.264943 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.268241 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" (UID: "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.268260 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-kube-api-access-4w9j4" (OuterVolumeSpecName: "kube-api-access-4w9j4") pod "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" (UID: "0dcd6240-fcf0-4c40-90ad-3285fcb4d46b"). 
InnerVolumeSpecName "kube-api-access-4w9j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.308886 4764 generic.go:334] "Generic (PLEG): container finished" podID="0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" containerID="ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542" exitCode=0 Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.308924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" event={"ID":"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b","Type":"ContainerDied","Data":"ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542"} Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.308967 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" event={"ID":"0dcd6240-fcf0-4c40-90ad-3285fcb4d46b","Type":"ContainerDied","Data":"3a2fadd306bef8796aafb742fcd18ddaf4efc5ba0e62042945c9b3b688ea4e97"} Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.308972 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-75858d884b-dgd8q" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.308987 4764 scope.go:117] "RemoveContainer" containerID="ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.334840 4764 scope.go:117] "RemoveContainer" containerID="ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542" Dec 03 23:46:24 crc kubenswrapper[4764]: E1203 23:46:24.335854 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542\": container with ID starting with ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542 not found: ID does not exist" containerID="ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.335898 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542"} err="failed to get container status \"ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542\": rpc error: code = NotFound desc = could not find container \"ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542\": container with ID starting with ad94aba9e37aa109cf3793518de122939149c31b7bcebab5e6a8620c3fdb2542 not found: ID does not exist" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.339989 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-75858d884b-dgd8q"] Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.343053 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-75858d884b-dgd8q"] Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.367843 4764 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.367914 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w9j4\" (UniqueName: \"kubernetes.io/projected/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b-kube-api-access-4w9j4\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.558822 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" path="/var/lib/kubelet/pods/0dcd6240-fcf0-4c40-90ad-3285fcb4d46b/volumes" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.925081 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-nm6vr"] Dec 03 23:46:24 crc kubenswrapper[4764]: E1203 23:46:24.925639 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" containerName="controller-manager" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.925653 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" containerName="controller-manager" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.925856 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcd6240-fcf0-4c40-90ad-3285fcb4d46b" containerName="controller-manager" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.926367 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.931702 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.934693 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.934870 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.935256 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.935553 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.938521 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.941457 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.950398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-nm6vr"] Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.976108 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-proxy-ca-bundles\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " 
pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.976180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-serving-cert\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.976209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-client-ca\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.976241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsz9h\" (UniqueName: \"kubernetes.io/projected/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-kube-api-access-hsz9h\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:24 crc kubenswrapper[4764]: I1203 23:46:24.976287 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-config\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.077569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-serving-cert\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.077620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-client-ca\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.077647 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsz9h\" (UniqueName: \"kubernetes.io/projected/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-kube-api-access-hsz9h\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.077681 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-config\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.077729 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-proxy-ca-bundles\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.079251 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-proxy-ca-bundles\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.079534 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-client-ca\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.079633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-config\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.082506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-serving-cert\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.109816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsz9h\" (UniqueName: \"kubernetes.io/projected/33e3fc59-d2a9-4daa-ba51-47552dd6fa42-kube-api-access-hsz9h\") pod \"controller-manager-5488f76589-nm6vr\" (UID: \"33e3fc59-d2a9-4daa-ba51-47552dd6fa42\") " pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 
23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.252974 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:25 crc kubenswrapper[4764]: I1203 23:46:25.561392 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5488f76589-nm6vr"] Dec 03 23:46:25 crc kubenswrapper[4764]: W1203 23:46:25.579814 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e3fc59_d2a9_4daa_ba51_47552dd6fa42.slice/crio-8df563eb1ea7c39dac9e0c041615d1fdc0074cc4de7e75f28f8059fc8c51497e WatchSource:0}: Error finding container 8df563eb1ea7c39dac9e0c041615d1fdc0074cc4de7e75f28f8059fc8c51497e: Status 404 returned error can't find the container with id 8df563eb1ea7c39dac9e0c041615d1fdc0074cc4de7e75f28f8059fc8c51497e Dec 03 23:46:26 crc kubenswrapper[4764]: I1203 23:46:26.332072 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" event={"ID":"33e3fc59-d2a9-4daa-ba51-47552dd6fa42","Type":"ContainerStarted","Data":"42b399ec24e2c033f9f75e4cf76b4123c5aba4deda627957748c1f8f7a4e86ec"} Dec 03 23:46:26 crc kubenswrapper[4764]: I1203 23:46:26.332405 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:26 crc kubenswrapper[4764]: I1203 23:46:26.332421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" event={"ID":"33e3fc59-d2a9-4daa-ba51-47552dd6fa42","Type":"ContainerStarted","Data":"8df563eb1ea7c39dac9e0c041615d1fdc0074cc4de7e75f28f8059fc8c51497e"} Dec 03 23:46:26 crc kubenswrapper[4764]: I1203 23:46:26.339243 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" Dec 03 23:46:26 crc kubenswrapper[4764]: I1203 23:46:26.369730 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5488f76589-nm6vr" podStartSLOduration=3.369690758 podStartE2EDuration="3.369690758s" podCreationTimestamp="2025-12-03 23:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:26.352427868 +0000 UTC m=+322.113752279" watchObservedRunningTime="2025-12-03 23:46:26.369690758 +0000 UTC m=+322.131015169" Dec 03 23:46:43 crc kubenswrapper[4764]: I1203 23:46:43.683989 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj"] Dec 03 23:46:43 crc kubenswrapper[4764]: I1203 23:46:43.685017 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" podUID="9791cec8-789e-41c8-8a46-96bcec6873fd" containerName="route-controller-manager" containerID="cri-o://d755b5b722545c08d80659f97b9f6bdf5361267f022c397182aa39650ec0005a" gracePeriod=30 Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.447521 4764 generic.go:334] "Generic (PLEG): container finished" podID="9791cec8-789e-41c8-8a46-96bcec6873fd" containerID="d755b5b722545c08d80659f97b9f6bdf5361267f022c397182aa39650ec0005a" exitCode=0 Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.447604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" event={"ID":"9791cec8-789e-41c8-8a46-96bcec6873fd","Type":"ContainerDied","Data":"d755b5b722545c08d80659f97b9f6bdf5361267f022c397182aa39650ec0005a"} Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.682060 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.772153 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc22h\" (UniqueName: \"kubernetes.io/projected/9791cec8-789e-41c8-8a46-96bcec6873fd-kube-api-access-xc22h\") pod \"9791cec8-789e-41c8-8a46-96bcec6873fd\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.772218 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-client-ca\") pod \"9791cec8-789e-41c8-8a46-96bcec6873fd\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.772336 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-config\") pod \"9791cec8-789e-41c8-8a46-96bcec6873fd\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.772407 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9791cec8-789e-41c8-8a46-96bcec6873fd-serving-cert\") pod \"9791cec8-789e-41c8-8a46-96bcec6873fd\" (UID: \"9791cec8-789e-41c8-8a46-96bcec6873fd\") " Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.773063 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-client-ca" (OuterVolumeSpecName: "client-ca") pod "9791cec8-789e-41c8-8a46-96bcec6873fd" (UID: "9791cec8-789e-41c8-8a46-96bcec6873fd"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.773303 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-config" (OuterVolumeSpecName: "config") pod "9791cec8-789e-41c8-8a46-96bcec6873fd" (UID: "9791cec8-789e-41c8-8a46-96bcec6873fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.777886 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9791cec8-789e-41c8-8a46-96bcec6873fd-kube-api-access-xc22h" (OuterVolumeSpecName: "kube-api-access-xc22h") pod "9791cec8-789e-41c8-8a46-96bcec6873fd" (UID: "9791cec8-789e-41c8-8a46-96bcec6873fd"). InnerVolumeSpecName "kube-api-access-xc22h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.780820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9791cec8-789e-41c8-8a46-96bcec6873fd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9791cec8-789e-41c8-8a46-96bcec6873fd" (UID: "9791cec8-789e-41c8-8a46-96bcec6873fd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.874177 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.874230 4764 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9791cec8-789e-41c8-8a46-96bcec6873fd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.874251 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc22h\" (UniqueName: \"kubernetes.io/projected/9791cec8-789e-41c8-8a46-96bcec6873fd-kube-api-access-xc22h\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.874270 4764 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9791cec8-789e-41c8-8a46-96bcec6873fd-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.931978 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj"] Dec 03 23:46:44 crc kubenswrapper[4764]: E1203 23:46:44.932237 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9791cec8-789e-41c8-8a46-96bcec6873fd" containerName="route-controller-manager" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.932255 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9791cec8-789e-41c8-8a46-96bcec6873fd" containerName="route-controller-manager" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.932379 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9791cec8-789e-41c8-8a46-96bcec6873fd" containerName="route-controller-manager" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.932879 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:44 crc kubenswrapper[4764]: I1203 23:46:44.983643 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj"] Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.081477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bde193-d92f-4cef-9737-1dd667456048-config\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.081576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bde193-d92f-4cef-9737-1dd667456048-client-ca\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.081659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bde193-d92f-4cef-9737-1dd667456048-serving-cert\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.081856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg4tx\" (UniqueName: \"kubernetes.io/projected/67bde193-d92f-4cef-9737-1dd667456048-kube-api-access-pg4tx\") pod 
\"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.182771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bde193-d92f-4cef-9737-1dd667456048-config\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.182838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bde193-d92f-4cef-9737-1dd667456048-client-ca\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.182873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bde193-d92f-4cef-9737-1dd667456048-serving-cert\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.182910 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg4tx\" (UniqueName: \"kubernetes.io/projected/67bde193-d92f-4cef-9737-1dd667456048-kube-api-access-pg4tx\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.184108 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67bde193-d92f-4cef-9737-1dd667456048-client-ca\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.184167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67bde193-d92f-4cef-9737-1dd667456048-config\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.188305 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67bde193-d92f-4cef-9737-1dd667456048-serving-cert\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.201592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg4tx\" (UniqueName: \"kubernetes.io/projected/67bde193-d92f-4cef-9737-1dd667456048-kube-api-access-pg4tx\") pod \"route-controller-manager-77b5c7dfdc-m8rwj\" (UID: \"67bde193-d92f-4cef-9737-1dd667456048\") " pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.272606 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.456078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" event={"ID":"9791cec8-789e-41c8-8a46-96bcec6873fd","Type":"ContainerDied","Data":"2923531dabe8ef1a8e6c7e356818999c653dad689940161f01e1510cfd0a798e"} Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.456128 4764 scope.go:117] "RemoveContainer" containerID="d755b5b722545c08d80659f97b9f6bdf5361267f022c397182aa39650ec0005a" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.456240 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj" Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.497591 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj"] Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.505201 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79b6b76b5b-mv8mj"] Dec 03 23:46:45 crc kubenswrapper[4764]: I1203 23:46:45.692983 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj"] Dec 03 23:46:45 crc kubenswrapper[4764]: W1203 23:46:45.701065 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bde193_d92f_4cef_9737_1dd667456048.slice/crio-e8afdf461052b76cd0c8abf94eafe28c4b1f070df8c25cc5de452417c42e546c WatchSource:0}: Error finding container e8afdf461052b76cd0c8abf94eafe28c4b1f070df8c25cc5de452417c42e546c: Status 404 returned error can't find the container with id 
e8afdf461052b76cd0c8abf94eafe28c4b1f070df8c25cc5de452417c42e546c Dec 03 23:46:46 crc kubenswrapper[4764]: I1203 23:46:46.461598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" event={"ID":"67bde193-d92f-4cef-9737-1dd667456048","Type":"ContainerStarted","Data":"26ba585961b79a29f6d1f9ec86b9194d76d4eb494628045278b6941a35482bbc"} Dec 03 23:46:46 crc kubenswrapper[4764]: I1203 23:46:46.461984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:46 crc kubenswrapper[4764]: I1203 23:46:46.462292 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" event={"ID":"67bde193-d92f-4cef-9737-1dd667456048","Type":"ContainerStarted","Data":"e8afdf461052b76cd0c8abf94eafe28c4b1f070df8c25cc5de452417c42e546c"} Dec 03 23:46:46 crc kubenswrapper[4764]: I1203 23:46:46.467807 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" Dec 03 23:46:46 crc kubenswrapper[4764]: I1203 23:46:46.477810 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-77b5c7dfdc-m8rwj" podStartSLOduration=3.477788707 podStartE2EDuration="3.477788707s" podCreationTimestamp="2025-12-03 23:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:46.47739449 +0000 UTC m=+342.238718911" watchObservedRunningTime="2025-12-03 23:46:46.477788707 +0000 UTC m=+342.239113118" Dec 03 23:46:46 crc kubenswrapper[4764]: I1203 23:46:46.555275 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9791cec8-789e-41c8-8a46-96bcec6873fd" 
path="/var/lib/kubelet/pods/9791cec8-789e-41c8-8a46-96bcec6873fd/volumes" Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.910701 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4dlcs"] Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.911782 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4dlcs" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="registry-server" containerID="cri-o://6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b" gracePeriod=30 Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.940164 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vn9sk"] Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.940462 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vn9sk" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="registry-server" containerID="cri-o://015d7437593405fa0e2a06ccb869ca9c4c3ee0e000bb2b2c21536573d6912baa" gracePeriod=30 Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.954646 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sws9j"] Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.955200 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" containerID="cri-o://687146a8cef817d1fd9d607dd124b2fe1cab41f26deb46632c372164215cf28f" gracePeriod=30 Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.961553 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdjgl"] Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.962091 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wdjgl" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="registry-server" containerID="cri-o://70c59ca6b10d02eab5718ee20fbecde9e963a55ebc182b1486cfbe913251ad13" gracePeriod=30 Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.972445 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zz8jv"] Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.972844 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zz8jv" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="registry-server" containerID="cri-o://5196c7456fc8510942642cabb64ef1489895ef9ab749a596f4a9513709d585fd" gracePeriod=30 Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.976444 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mzk5"] Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.977462 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:48 crc kubenswrapper[4764]: I1203 23:46:48.978966 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mzk5"] Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.133505 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.133579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.133662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs4bs\" (UniqueName: \"kubernetes.io/projected/b613c3f6-59f7-46b1-90ba-09793e962453-kube-api-access-fs4bs\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.235201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: 
\"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.235563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.235629 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs4bs\" (UniqueName: \"kubernetes.io/projected/b613c3f6-59f7-46b1-90ba-09793e962453-kube-api-access-fs4bs\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.236908 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.241158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.257004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fs4bs\" (UniqueName: \"kubernetes.io/projected/b613c3f6-59f7-46b1-90ba-09793e962453-kube-api-access-fs4bs\") pod \"marketplace-operator-79b997595-7mzk5\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.309919 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.445048 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.485842 4764 generic.go:334] "Generic (PLEG): container finished" podID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerID="015d7437593405fa0e2a06ccb869ca9c4c3ee0e000bb2b2c21536573d6912baa" exitCode=0 Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.485921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn9sk" event={"ID":"84e0b7ae-01df-4863-b257-afb9a27507cd","Type":"ContainerDied","Data":"015d7437593405fa0e2a06ccb869ca9c4c3ee0e000bb2b2c21536573d6912baa"} Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.488144 4764 generic.go:334] "Generic (PLEG): container finished" podID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerID="70c59ca6b10d02eab5718ee20fbecde9e963a55ebc182b1486cfbe913251ad13" exitCode=0 Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.488198 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdjgl" event={"ID":"74e0af5e-bc95-4918-9c09-524e159e1eba","Type":"ContainerDied","Data":"70c59ca6b10d02eab5718ee20fbecde9e963a55ebc182b1486cfbe913251ad13"} Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.490586 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="c9415d09-8034-4627-80dc-ae731d9f466e" containerID="687146a8cef817d1fd9d607dd124b2fe1cab41f26deb46632c372164215cf28f" exitCode=0 Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.490641 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" event={"ID":"c9415d09-8034-4627-80dc-ae731d9f466e","Type":"ContainerDied","Data":"687146a8cef817d1fd9d607dd124b2fe1cab41f26deb46632c372164215cf28f"} Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.490665 4764 scope.go:117] "RemoveContainer" containerID="96b52f286694278c724a4e1484a8a84187fcc67c763f8c345700208209105ba3" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.497197 4764 generic.go:334] "Generic (PLEG): container finished" podID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerID="6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b" exitCode=0 Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.497280 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4dlcs" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.497284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4dlcs" event={"ID":"3ef5d8b9-01e5-4668-aa91-0406a52a40ca","Type":"ContainerDied","Data":"6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b"} Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.497398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4dlcs" event={"ID":"3ef5d8b9-01e5-4668-aa91-0406a52a40ca","Type":"ContainerDied","Data":"fda7394c01e0633a382eb2af3af341e0a35cd5e611241654f852ddc3cdcc7b5d"} Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.500083 4764 generic.go:334] "Generic (PLEG): container finished" podID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerID="5196c7456fc8510942642cabb64ef1489895ef9ab749a596f4a9513709d585fd" exitCode=0 Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.500113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerDied","Data":"5196c7456fc8510942642cabb64ef1489895ef9ab749a596f4a9513709d585fd"} Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.508094 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.516666 4764 scope.go:117] "RemoveContainer" containerID="6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.536813 4764 scope.go:117] "RemoveContainer" containerID="a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.538286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-utilities\") pod \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.538371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-catalog-content\") pod \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.538411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42drr\" (UniqueName: \"kubernetes.io/projected/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-kube-api-access-42drr\") pod \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\" (UID: \"3ef5d8b9-01e5-4668-aa91-0406a52a40ca\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.539295 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-utilities" (OuterVolumeSpecName: "utilities") pod "3ef5d8b9-01e5-4668-aa91-0406a52a40ca" (UID: "3ef5d8b9-01e5-4668-aa91-0406a52a40ca"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.541147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-kube-api-access-42drr" (OuterVolumeSpecName: "kube-api-access-42drr") pod "3ef5d8b9-01e5-4668-aa91-0406a52a40ca" (UID: "3ef5d8b9-01e5-4668-aa91-0406a52a40ca"). InnerVolumeSpecName "kube-api-access-42drr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.562459 4764 scope.go:117] "RemoveContainer" containerID="c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.594440 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ef5d8b9-01e5-4668-aa91-0406a52a40ca" (UID: "3ef5d8b9-01e5-4668-aa91-0406a52a40ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.595643 4764 scope.go:117] "RemoveContainer" containerID="6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b" Dec 03 23:46:49 crc kubenswrapper[4764]: E1203 23:46:49.596353 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b\": container with ID starting with 6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b not found: ID does not exist" containerID="6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.596393 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b"} err="failed to get container status \"6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b\": rpc error: code = NotFound desc = could not find container \"6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b\": container with ID starting with 6e0810948891b0073cc3dc6baaa8b729efc2b9ec796dead20fd4cfd1e2585f0b not found: ID does not exist" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.596420 4764 scope.go:117] "RemoveContainer" containerID="a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83" Dec 03 23:46:49 crc kubenswrapper[4764]: E1203 23:46:49.597094 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83\": container with ID starting with a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83 not found: ID does not exist" containerID="a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.597112 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83"} err="failed to get container status \"a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83\": rpc error: code = NotFound desc = could not find container \"a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83\": container with ID starting with a3fe28f51ff5d559c04a37b087a2285457cc63150656dc22630c4d79bf502e83 not found: ID does not exist" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.597124 4764 scope.go:117] "RemoveContainer" containerID="c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740" Dec 03 23:46:49 crc kubenswrapper[4764]: E1203 23:46:49.598207 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740\": container with ID starting with c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740 not found: ID does not exist" containerID="c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.598230 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740"} err="failed to get container status \"c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740\": rpc error: code = NotFound desc = could not find container \"c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740\": container with ID starting with c0fef74c99332ab34458282eed22f84a04804189e7baf0f74f5ce3abc1c0b740 not found: ID does not exist" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.640216 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65b4m\" (UniqueName: 
\"kubernetes.io/projected/84e0b7ae-01df-4863-b257-afb9a27507cd-kube-api-access-65b4m\") pod \"84e0b7ae-01df-4863-b257-afb9a27507cd\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.640281 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-catalog-content\") pod \"84e0b7ae-01df-4863-b257-afb9a27507cd\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.640310 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-utilities\") pod \"84e0b7ae-01df-4863-b257-afb9a27507cd\" (UID: \"84e0b7ae-01df-4863-b257-afb9a27507cd\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.640586 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.640605 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42drr\" (UniqueName: \"kubernetes.io/projected/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-kube-api-access-42drr\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.640619 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ef5d8b9-01e5-4668-aa91-0406a52a40ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.643447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-utilities" (OuterVolumeSpecName: "utilities") pod "84e0b7ae-01df-4863-b257-afb9a27507cd" (UID: 
"84e0b7ae-01df-4863-b257-afb9a27507cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.663548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84e0b7ae-01df-4863-b257-afb9a27507cd-kube-api-access-65b4m" (OuterVolumeSpecName: "kube-api-access-65b4m") pod "84e0b7ae-01df-4863-b257-afb9a27507cd" (UID: "84e0b7ae-01df-4863-b257-afb9a27507cd"). InnerVolumeSpecName "kube-api-access-65b4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.694481 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84e0b7ae-01df-4863-b257-afb9a27507cd" (UID: "84e0b7ae-01df-4863-b257-afb9a27507cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.718600 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.741448 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65b4m\" (UniqueName: \"kubernetes.io/projected/84e0b7ae-01df-4863-b257-afb9a27507cd-kube-api-access-65b4m\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.741489 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.741503 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84e0b7ae-01df-4863-b257-afb9a27507cd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.741974 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.762657 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.826625 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4dlcs"] Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.834431 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4dlcs"] Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.842247 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-operator-metrics\") pod \"c9415d09-8034-4627-80dc-ae731d9f466e\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.842291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn2kq\" (UniqueName: \"kubernetes.io/projected/c9415d09-8034-4627-80dc-ae731d9f466e-kube-api-access-kn2kq\") pod \"c9415d09-8034-4627-80dc-ae731d9f466e\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.842325 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-utilities\") pod \"74e0af5e-bc95-4918-9c09-524e159e1eba\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.842362 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-catalog-content\") pod \"74e0af5e-bc95-4918-9c09-524e159e1eba\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.842394 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-trusted-ca\") pod \"c9415d09-8034-4627-80dc-ae731d9f466e\" (UID: \"c9415d09-8034-4627-80dc-ae731d9f466e\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.842443 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqm79\" (UniqueName: \"kubernetes.io/projected/74e0af5e-bc95-4918-9c09-524e159e1eba-kube-api-access-nqm79\") pod \"74e0af5e-bc95-4918-9c09-524e159e1eba\" (UID: \"74e0af5e-bc95-4918-9c09-524e159e1eba\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.843341 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-utilities" (OuterVolumeSpecName: "utilities") pod "74e0af5e-bc95-4918-9c09-524e159e1eba" (UID: "74e0af5e-bc95-4918-9c09-524e159e1eba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.844027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "c9415d09-8034-4627-80dc-ae731d9f466e" (UID: "c9415d09-8034-4627-80dc-ae731d9f466e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.844841 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e0af5e-bc95-4918-9c09-524e159e1eba-kube-api-access-nqm79" (OuterVolumeSpecName: "kube-api-access-nqm79") pod "74e0af5e-bc95-4918-9c09-524e159e1eba" (UID: "74e0af5e-bc95-4918-9c09-524e159e1eba"). InnerVolumeSpecName "kube-api-access-nqm79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.845069 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9415d09-8034-4627-80dc-ae731d9f466e-kube-api-access-kn2kq" (OuterVolumeSpecName: "kube-api-access-kn2kq") pod "c9415d09-8034-4627-80dc-ae731d9f466e" (UID: "c9415d09-8034-4627-80dc-ae731d9f466e"). InnerVolumeSpecName "kube-api-access-kn2kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.847571 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "c9415d09-8034-4627-80dc-ae731d9f466e" (UID: "c9415d09-8034-4627-80dc-ae731d9f466e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.863056 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74e0af5e-bc95-4918-9c09-524e159e1eba" (UID: "74e0af5e-bc95-4918-9c09-524e159e1eba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.872635 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mzk5"] Dec 03 23:46:49 crc kubenswrapper[4764]: W1203 23:46:49.877462 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb613c3f6_59f7_46b1_90ba_09793e962453.slice/crio-780a3ccb1c4dacbdf4c5d027449808cadcc62d4faf4b59debec43949ce302f69 WatchSource:0}: Error finding container 780a3ccb1c4dacbdf4c5d027449808cadcc62d4faf4b59debec43949ce302f69: Status 404 returned error can't find the container with id 780a3ccb1c4dacbdf4c5d027449808cadcc62d4faf4b59debec43949ce302f69 Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943507 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-utilities\") pod \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943547 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-catalog-content\") pod \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943568 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v26s2\" (UniqueName: \"kubernetes.io/projected/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-kube-api-access-v26s2\") pod \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\" (UID: \"8d09c74d-dc25-4769-9580-88f3ea4fcf8e\") " Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943850 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943861 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943872 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqm79\" (UniqueName: \"kubernetes.io/projected/74e0af5e-bc95-4918-9c09-524e159e1eba-kube-api-access-nqm79\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943880 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c9415d09-8034-4627-80dc-ae731d9f466e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943889 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn2kq\" (UniqueName: \"kubernetes.io/projected/c9415d09-8034-4627-80dc-ae731d9f466e-kube-api-access-kn2kq\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.943897 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74e0af5e-bc95-4918-9c09-524e159e1eba-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.945072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-utilities" (OuterVolumeSpecName: "utilities") pod "8d09c74d-dc25-4769-9580-88f3ea4fcf8e" (UID: "8d09c74d-dc25-4769-9580-88f3ea4fcf8e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:49 crc kubenswrapper[4764]: I1203 23:46:49.946101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-kube-api-access-v26s2" (OuterVolumeSpecName: "kube-api-access-v26s2") pod "8d09c74d-dc25-4769-9580-88f3ea4fcf8e" (UID: "8d09c74d-dc25-4769-9580-88f3ea4fcf8e"). InnerVolumeSpecName "kube-api-access-v26s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.045118 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.045175 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v26s2\" (UniqueName: \"kubernetes.io/projected/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-kube-api-access-v26s2\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.048179 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d09c74d-dc25-4769-9580-88f3ea4fcf8e" (UID: "8d09c74d-dc25-4769-9580-88f3ea4fcf8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.145786 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d09c74d-dc25-4769-9580-88f3ea4fcf8e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.507517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zz8jv" event={"ID":"8d09c74d-dc25-4769-9580-88f3ea4fcf8e","Type":"ContainerDied","Data":"d2d89ac0028e6b538f77e53570e757f07220f5d22677fa98597c8298ef26b95b"} Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.507567 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zz8jv" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.507589 4764 scope.go:117] "RemoveContainer" containerID="5196c7456fc8510942642cabb64ef1489895ef9ab749a596f4a9513709d585fd" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.511918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vn9sk" event={"ID":"84e0b7ae-01df-4863-b257-afb9a27507cd","Type":"ContainerDied","Data":"53cb7fa0d6d7ec42dbc894759835e28618e7fd0172d09c38c1929326316605e7"} Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.512043 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vn9sk" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.513246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" event={"ID":"b613c3f6-59f7-46b1-90ba-09793e962453","Type":"ContainerStarted","Data":"d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077"} Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.513299 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" event={"ID":"b613c3f6-59f7-46b1-90ba-09793e962453","Type":"ContainerStarted","Data":"780a3ccb1c4dacbdf4c5d027449808cadcc62d4faf4b59debec43949ce302f69"} Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.513327 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.517795 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.520735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdjgl" event={"ID":"74e0af5e-bc95-4918-9c09-524e159e1eba","Type":"ContainerDied","Data":"3499adadd68626c5c313b03d2b97e181cf9a9ac71f447f9ac143f58fc523ff27"} Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.520829 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdjgl" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.521901 4764 scope.go:117] "RemoveContainer" containerID="cdd72437bc81ccf7c8bdf75d479473ecdc27f81a24ee6b24874d84a768356628" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.522374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" event={"ID":"c9415d09-8034-4627-80dc-ae731d9f466e","Type":"ContainerDied","Data":"dacc0af9bf867e357193602457d7abe033b3fcd97dfc9660b291a27ac8683003"} Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.522395 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sws9j" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.572854 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" podStartSLOduration=2.572832635 podStartE2EDuration="2.572832635s" podCreationTimestamp="2025-12-03 23:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:46:50.53328749 +0000 UTC m=+346.294611931" watchObservedRunningTime="2025-12-03 23:46:50.572832635 +0000 UTC m=+346.334157046" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.585828 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" path="/var/lib/kubelet/pods/3ef5d8b9-01e5-4668-aa91-0406a52a40ca/volumes" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.592005 4764 scope.go:117] "RemoveContainer" containerID="f737f2b3826e4e3014d4c9b02de2bbbe502f1b5ce922e18f72f125acd4b60af9" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.614325 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zz8jv"] Dec 03 
23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.619027 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zz8jv"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.621247 4764 scope.go:117] "RemoveContainer" containerID="015d7437593405fa0e2a06ccb869ca9c4c3ee0e000bb2b2c21536573d6912baa" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.625675 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sws9j"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.633579 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sws9j"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.639556 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vn9sk"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.645658 4764 scope.go:117] "RemoveContainer" containerID="b7a79b4ea53875743255dc069570ee4f2e12b920a8486271434898051defd3a8" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.645892 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vn9sk"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.651560 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdjgl"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.654462 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdjgl"] Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.661734 4764 scope.go:117] "RemoveContainer" containerID="00aef84fa38aa27da61060ffdc0479899bbe578d526ad659144bf016f07c2697" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.677219 4764 scope.go:117] "RemoveContainer" containerID="70c59ca6b10d02eab5718ee20fbecde9e963a55ebc182b1486cfbe913251ad13" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.698517 
4764 scope.go:117] "RemoveContainer" containerID="c2b05dc87e4023a12df727e7a3511cfa179c9185b6768573e54b8ec82e8e5158" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.710169 4764 scope.go:117] "RemoveContainer" containerID="c77752d707616d56cc26d3350cc2ecb3c2c94474602bea4019408fd1e67bc1ff" Dec 03 23:46:50 crc kubenswrapper[4764]: I1203 23:46:50.723391 4764 scope.go:117] "RemoveContainer" containerID="687146a8cef817d1fd9d607dd124b2fe1cab41f26deb46632c372164215cf28f" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898056 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-567bg"] Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898748 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898770 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898791 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898823 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898835 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898851 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898863 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898880 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898892 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898908 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.898920 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.898976 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899095 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899117 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899129 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899152 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899164 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="extract-utilities" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899180 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899193 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899209 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899221 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899238 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899250 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="extract-content" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899270 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899281 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: E1203 23:46:51.899295 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899306 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899465 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899496 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899511 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef5d8b9-01e5-4668-aa91-0406a52a40ca" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899533 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899551 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" containerName="registry-server" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.899868 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" containerName="marketplace-operator" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.900676 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.905455 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.918208 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-567bg"] Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.970951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-utilities\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.971107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv467\" (UniqueName: \"kubernetes.io/projected/e437244c-a1f6-4f74-bfc6-8eb8366719d4-kube-api-access-tv467\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:51 crc kubenswrapper[4764]: I1203 23:46:51.971225 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-catalog-content\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.072279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv467\" (UniqueName: \"kubernetes.io/projected/e437244c-a1f6-4f74-bfc6-8eb8366719d4-kube-api-access-tv467\") pod \"certified-operators-567bg\" 
(UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.072400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-catalog-content\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.072510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-utilities\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.073025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-catalog-content\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.073224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-utilities\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.097476 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-579hd"] Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.098645 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.105873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.106638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-579hd"] Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.112831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv467\" (UniqueName: \"kubernetes.io/projected/e437244c-a1f6-4f74-bfc6-8eb8366719d4-kube-api-access-tv467\") pod \"certified-operators-567bg\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.173853 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-catalog-content\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.173902 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwkrk\" (UniqueName: \"kubernetes.io/projected/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-kube-api-access-hwkrk\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.173942 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-utilities\") pod \"community-operators-579hd\" (UID: 
\"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.224072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.275036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-utilities\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.275136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-catalog-content\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.275181 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwkrk\" (UniqueName: \"kubernetes.io/projected/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-kube-api-access-hwkrk\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.276252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-catalog-content\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.276356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-utilities\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.292354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwkrk\" (UniqueName: \"kubernetes.io/projected/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-kube-api-access-hwkrk\") pod \"community-operators-579hd\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.460512 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-579hd" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.557412 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e0af5e-bc95-4918-9c09-524e159e1eba" path="/var/lib/kubelet/pods/74e0af5e-bc95-4918-9c09-524e159e1eba/volumes" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.558321 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84e0b7ae-01df-4863-b257-afb9a27507cd" path="/var/lib/kubelet/pods/84e0b7ae-01df-4863-b257-afb9a27507cd/volumes" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.559149 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d09c74d-dc25-4769-9580-88f3ea4fcf8e" path="/var/lib/kubelet/pods/8d09c74d-dc25-4769-9580-88f3ea4fcf8e/volumes" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.560616 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9415d09-8034-4627-80dc-ae731d9f466e" path="/var/lib/kubelet/pods/c9415d09-8034-4627-80dc-ae731d9f466e/volumes" Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.640125 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-567bg"] Dec 03 23:46:52 crc kubenswrapper[4764]: I1203 23:46:52.915607 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-579hd"] Dec 03 23:46:52 crc kubenswrapper[4764]: W1203 23:46:52.923209 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf0a4d00_e173_46ad_9332_7b8cf8801cb3.slice/crio-1e974564b179831fbf30b60f6e58a786f4bcf29d2ca639556c0f11b77cdbac1d WatchSource:0}: Error finding container 1e974564b179831fbf30b60f6e58a786f4bcf29d2ca639556c0f11b77cdbac1d: Status 404 returned error can't find the container with id 1e974564b179831fbf30b60f6e58a786f4bcf29d2ca639556c0f11b77cdbac1d Dec 03 23:46:53 crc kubenswrapper[4764]: I1203 23:46:53.558709 4764 generic.go:334] "Generic (PLEG): container finished" podID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerID="517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8" exitCode=0 Dec 03 23:46:53 crc kubenswrapper[4764]: I1203 23:46:53.559050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567bg" event={"ID":"e437244c-a1f6-4f74-bfc6-8eb8366719d4","Type":"ContainerDied","Data":"517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8"} Dec 03 23:46:53 crc kubenswrapper[4764]: I1203 23:46:53.559176 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567bg" event={"ID":"e437244c-a1f6-4f74-bfc6-8eb8366719d4","Type":"ContainerStarted","Data":"8c08ade6d0a7667d4682bcd8cd68dde2bdc792961cf55a4d2b7dea7ae0b52103"} Dec 03 23:46:53 crc kubenswrapper[4764]: I1203 23:46:53.563969 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerID="2df392272ed38275455b0ec92abf69b534418a1ddc35ac7ad84f002640a8022a" exitCode=0 Dec 03 23:46:53 crc kubenswrapper[4764]: I1203 23:46:53.563994 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-579hd" event={"ID":"cf0a4d00-e173-46ad-9332-7b8cf8801cb3","Type":"ContainerDied","Data":"2df392272ed38275455b0ec92abf69b534418a1ddc35ac7ad84f002640a8022a"} Dec 03 23:46:53 crc kubenswrapper[4764]: I1203 23:46:53.564027 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-579hd" event={"ID":"cf0a4d00-e173-46ad-9332-7b8cf8801cb3","Type":"ContainerStarted","Data":"1e974564b179831fbf30b60f6e58a786f4bcf29d2ca639556c0f11b77cdbac1d"} Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.286834 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d4vk9"] Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.288052 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.290399 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.298575 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4vk9"] Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.404704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v7wb\" (UniqueName: \"kubernetes.io/projected/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-kube-api-access-8v7wb\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.404811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-utilities\") pod 
\"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.404900 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-catalog-content\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.491823 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4q9cs"] Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.492755 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.494394 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.505422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4q9cs"] Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.505536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-utilities\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.505931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-catalog-content\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " 
pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.506169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-catalog-content\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.506169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v7wb\" (UniqueName: \"kubernetes.io/projected/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-kube-api-access-8v7wb\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.505981 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-utilities\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.525986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v7wb\" (UniqueName: \"kubernetes.io/projected/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-kube-api-access-8v7wb\") pod \"redhat-marketplace-d4vk9\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.571622 4764 generic.go:334] "Generic (PLEG): container finished" podID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerID="783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73" exitCode=0 Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.571697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-567bg" event={"ID":"e437244c-a1f6-4f74-bfc6-8eb8366719d4","Type":"ContainerDied","Data":"783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73"} Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.573212 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerID="6ebec57a5380ba2cec0c9f73de6734f6c26732f83ca84d87cb1f2485c061674a" exitCode=0 Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.573233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-579hd" event={"ID":"cf0a4d00-e173-46ad-9332-7b8cf8801cb3","Type":"ContainerDied","Data":"6ebec57a5380ba2cec0c9f73de6734f6c26732f83ca84d87cb1f2485c061674a"} Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.605465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.606855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-catalog-content\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.606898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt7km\" (UniqueName: \"kubernetes.io/projected/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-kube-api-access-bt7km\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.606948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-utilities\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.708483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-utilities\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.708852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-catalog-content\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.709446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-catalog-content\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.709557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-utilities\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.709187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt7km\" (UniqueName: 
\"kubernetes.io/projected/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-kube-api-access-bt7km\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.731739 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt7km\" (UniqueName: \"kubernetes.io/projected/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-kube-api-access-bt7km\") pod \"redhat-operators-4q9cs\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:54 crc kubenswrapper[4764]: I1203 23:46:54.912015 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.004116 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4vk9"] Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.338884 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4q9cs"] Dec 03 23:46:55 crc kubenswrapper[4764]: W1203 23:46:55.348385 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58504f1f_ebbc_4c05_a4fd_f68cfa3609ce.slice/crio-2d73b33e9e6efa6aa0b783dcda998d53922309f81d5b4515b2366782c1f6daef WatchSource:0}: Error finding container 2d73b33e9e6efa6aa0b783dcda998d53922309f81d5b4515b2366782c1f6daef: Status 404 returned error can't find the container with id 2d73b33e9e6efa6aa0b783dcda998d53922309f81d5b4515b2366782c1f6daef Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.595166 4764 generic.go:334] "Generic (PLEG): container finished" podID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerID="9281a039b8c25b036794888fbc3bd629b416cc75c0a587ef8549fcb5a040523a" exitCode=0 Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 
23:46:55.595467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4vk9" event={"ID":"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db","Type":"ContainerDied","Data":"9281a039b8c25b036794888fbc3bd629b416cc75c0a587ef8549fcb5a040523a"} Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.595495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4vk9" event={"ID":"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db","Type":"ContainerStarted","Data":"f008e2a000be0f9c536930c973cb32481f2f875741ee79f8aa656940025e95cd"} Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.599177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567bg" event={"ID":"e437244c-a1f6-4f74-bfc6-8eb8366719d4","Type":"ContainerStarted","Data":"0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563"} Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.602256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-579hd" event={"ID":"cf0a4d00-e173-46ad-9332-7b8cf8801cb3","Type":"ContainerStarted","Data":"0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca"} Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.604367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerStarted","Data":"f8dfbab296b15bcad9ebe856603a8f2587ef299ee93f0bb29067e9f724ff654f"} Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.604390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerStarted","Data":"2d73b33e9e6efa6aa0b783dcda998d53922309f81d5b4515b2366782c1f6daef"} Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.629144 4764 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/community-operators-579hd" podStartSLOduration=2.219344899 podStartE2EDuration="3.62912826s" podCreationTimestamp="2025-12-03 23:46:52 +0000 UTC" firstStartedPulling="2025-12-03 23:46:53.566697212 +0000 UTC m=+349.328021623" lastFinishedPulling="2025-12-03 23:46:54.976480573 +0000 UTC m=+350.737804984" observedRunningTime="2025-12-03 23:46:55.626184704 +0000 UTC m=+351.387509115" watchObservedRunningTime="2025-12-03 23:46:55.62912826 +0000 UTC m=+351.390452671" Dec 03 23:46:55 crc kubenswrapper[4764]: I1203 23:46:55.642846 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-567bg" podStartSLOduration=3.2186969579999998 podStartE2EDuration="4.642826802s" podCreationTimestamp="2025-12-03 23:46:51 +0000 UTC" firstStartedPulling="2025-12-03 23:46:53.56340051 +0000 UTC m=+349.324724921" lastFinishedPulling="2025-12-03 23:46:54.987530354 +0000 UTC m=+350.748854765" observedRunningTime="2025-12-03 23:46:55.640602729 +0000 UTC m=+351.401927140" watchObservedRunningTime="2025-12-03 23:46:55.642826802 +0000 UTC m=+351.404151203" Dec 03 23:46:56 crc kubenswrapper[4764]: I1203 23:46:56.610648 4764 generic.go:334] "Generic (PLEG): container finished" podID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerID="f8dfbab296b15bcad9ebe856603a8f2587ef299ee93f0bb29067e9f724ff654f" exitCode=0 Dec 03 23:46:56 crc kubenswrapper[4764]: I1203 23:46:56.610758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerDied","Data":"f8dfbab296b15bcad9ebe856603a8f2587ef299ee93f0bb29067e9f724ff654f"} Dec 03 23:46:56 crc kubenswrapper[4764]: I1203 23:46:56.611049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" 
event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerStarted","Data":"47fbb56ca12576a9bbf345ccee30b4e5b6fd3dc718754f8f3dcf59c1c2022da3"} Dec 03 23:46:56 crc kubenswrapper[4764]: I1203 23:46:56.612746 4764 generic.go:334] "Generic (PLEG): container finished" podID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerID="9d60c3cd6a766f4a972178f30a57623bfba473f991a42c23502bb614e651df32" exitCode=0 Dec 03 23:46:56 crc kubenswrapper[4764]: I1203 23:46:56.613857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4vk9" event={"ID":"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db","Type":"ContainerDied","Data":"9d60c3cd6a766f4a972178f30a57623bfba473f991a42c23502bb614e651df32"} Dec 03 23:46:57 crc kubenswrapper[4764]: I1203 23:46:57.622804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4vk9" event={"ID":"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db","Type":"ContainerStarted","Data":"a4cf9da8c6785ca4ac0659d435c7d3297e9503835bdd3640b6cb7f6b2a485c27"} Dec 03 23:46:57 crc kubenswrapper[4764]: I1203 23:46:57.624335 4764 generic.go:334] "Generic (PLEG): container finished" podID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerID="47fbb56ca12576a9bbf345ccee30b4e5b6fd3dc718754f8f3dcf59c1c2022da3" exitCode=0 Dec 03 23:46:57 crc kubenswrapper[4764]: I1203 23:46:57.624374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerDied","Data":"47fbb56ca12576a9bbf345ccee30b4e5b6fd3dc718754f8f3dcf59c1c2022da3"} Dec 03 23:46:57 crc kubenswrapper[4764]: I1203 23:46:57.644862 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d4vk9" podStartSLOduration=2.199279954 podStartE2EDuration="3.644841977s" podCreationTimestamp="2025-12-03 23:46:54 +0000 UTC" firstStartedPulling="2025-12-03 23:46:55.598141889 +0000 UTC 
m=+351.359466300" lastFinishedPulling="2025-12-03 23:46:57.043703912 +0000 UTC m=+352.805028323" observedRunningTime="2025-12-03 23:46:57.641123636 +0000 UTC m=+353.402448057" watchObservedRunningTime="2025-12-03 23:46:57.644841977 +0000 UTC m=+353.406166398" Dec 03 23:46:59 crc kubenswrapper[4764]: I1203 23:46:59.638664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerStarted","Data":"676d2771f19ef22634b8c77a3e146bdd263fdcecf537e9bf91485cd8f6176d02"} Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.224457 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.225079 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.298470 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.316596 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4q9cs" podStartSLOduration=5.8065833300000005 podStartE2EDuration="8.316548211s" podCreationTimestamp="2025-12-03 23:46:54 +0000 UTC" firstStartedPulling="2025-12-03 23:46:55.605222274 +0000 UTC m=+351.366546685" lastFinishedPulling="2025-12-03 23:46:58.115187155 +0000 UTC m=+353.876511566" observedRunningTime="2025-12-03 23:46:59.657607216 +0000 UTC m=+355.418931677" watchObservedRunningTime="2025-12-03 23:47:02.316548211 +0000 UTC m=+358.077872622" Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.462155 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-579hd" Dec 03 23:47:02 crc 
kubenswrapper[4764]: I1203 23:47:02.462219 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-579hd"
Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.502051 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-579hd"
Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.690810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-567bg"
Dec 03 23:47:02 crc kubenswrapper[4764]: I1203 23:47:02.691248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-579hd"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.606622 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d4vk9"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.607043 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d4vk9"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.650695 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d4vk9"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.699504 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rjbtb"]
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.709465 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.731413 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rjbtb"]
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.733284 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d4vk9"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/201a5fe6-690a-49e0-9319-54e73c021743-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmxzg\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-kube-api-access-vmxzg\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841464 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-registry-tls\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/201a5fe6-690a-49e0-9319-54e73c021743-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841563 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/201a5fe6-690a-49e0-9319-54e73c021743-registry-certificates\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/201a5fe6-690a-49e0-9319-54e73c021743-trusted-ca\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841698 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.841761 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-bound-sa-token\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.879258 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.912161 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4q9cs"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.912226 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4q9cs"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/201a5fe6-690a-49e0-9319-54e73c021743-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943179 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmxzg\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-kube-api-access-vmxzg\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943205 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-registry-tls\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/201a5fe6-690a-49e0-9319-54e73c021743-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/201a5fe6-690a-49e0-9319-54e73c021743-registry-certificates\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943269 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/201a5fe6-690a-49e0-9319-54e73c021743-trusted-ca\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-bound-sa-token\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.943904 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/201a5fe6-690a-49e0-9319-54e73c021743-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.944667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/201a5fe6-690a-49e0-9319-54e73c021743-registry-certificates\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.944967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/201a5fe6-690a-49e0-9319-54e73c021743-trusted-ca\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.948851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-registry-tls\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.949375 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/201a5fe6-690a-49e0-9319-54e73c021743-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.949426 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4q9cs"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.958597 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmxzg\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-kube-api-access-vmxzg\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:04 crc kubenswrapper[4764]: I1203 23:47:04.968379 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/201a5fe6-690a-49e0-9319-54e73c021743-bound-sa-token\") pod \"image-registry-66df7c8f76-rjbtb\" (UID: \"201a5fe6-690a-49e0-9319-54e73c021743\") " pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:05 crc kubenswrapper[4764]: I1203 23:47:05.035183 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:05 crc kubenswrapper[4764]: I1203 23:47:05.506519 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rjbtb"]
Dec 03 23:47:05 crc kubenswrapper[4764]: I1203 23:47:05.675935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb" event={"ID":"201a5fe6-690a-49e0-9319-54e73c021743","Type":"ContainerStarted","Data":"ff65515104dffecebfbed753a021e899f0e51abbe9068280abba0682ca50bda9"}
Dec 03 23:47:05 crc kubenswrapper[4764]: I1203 23:47:05.718105 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4q9cs"
Dec 03 23:47:06 crc kubenswrapper[4764]: I1203 23:47:06.684859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb" event={"ID":"201a5fe6-690a-49e0-9319-54e73c021743","Type":"ContainerStarted","Data":"4b6fb8fd0d95b3d1a3a3cb5152a264c20488652f41647274054542bb29ecfe6b"}
Dec 03 23:47:07 crc kubenswrapper[4764]: I1203 23:47:07.689065 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:07 crc kubenswrapper[4764]: I1203 23:47:07.721273 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb" podStartSLOduration=3.721255597 podStartE2EDuration="3.721255597s" podCreationTimestamp="2025-12-03 23:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:47:07.717633958 +0000 UTC m=+363.478958369" watchObservedRunningTime="2025-12-03 23:47:07.721255597 +0000 UTC m=+363.482580008"
Dec 03 23:47:20 crc kubenswrapper[4764]: I1203 23:47:20.868806 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:47:20 crc kubenswrapper[4764]: I1203 23:47:20.869154 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:47:25 crc kubenswrapper[4764]: I1203 23:47:25.046383 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rjbtb"
Dec 03 23:47:25 crc kubenswrapper[4764]: I1203 23:47:25.123056 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k47db"]
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.168814 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" podUID="c32eecc0-7e82-4d0b-bdbf-36fe53c01065" containerName="registry" containerID="cri-o://2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c" gracePeriod=30
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.664744 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.812534 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-tls\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.812635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-trusted-ca\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.812686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-ca-trust-extracted\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.812821 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-installation-pull-secrets\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.814073 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-bound-sa-token\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.814187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46czq\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-kube-api-access-46czq\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.814250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-certificates\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.814472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\" (UID: \"c32eecc0-7e82-4d0b-bdbf-36fe53c01065\") "
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.814799 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.815178 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.816441 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.822326 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-kube-api-access-46czq" (OuterVolumeSpecName: "kube-api-access-46czq") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "kube-api-access-46czq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.822502 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.823000 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.823199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.844568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.869194 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.869275 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.914157 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c32eecc0-7e82-4d0b-bdbf-36fe53c01065" (UID: "c32eecc0-7e82-4d0b-bdbf-36fe53c01065"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.915936 4764 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.915964 4764 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.915974 4764 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.915983 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46czq\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-kube-api-access-46czq\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.915992 4764 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:50 crc kubenswrapper[4764]: I1203 23:47:50.916001 4764 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c32eecc0-7e82-4d0b-bdbf-36fe53c01065-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.000999 4764 generic.go:334] "Generic (PLEG): container finished" podID="c32eecc0-7e82-4d0b-bdbf-36fe53c01065" containerID="2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c" exitCode=0
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.001042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" event={"ID":"c32eecc0-7e82-4d0b-bdbf-36fe53c01065","Type":"ContainerDied","Data":"2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c"}
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.001069 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-k47db" event={"ID":"c32eecc0-7e82-4d0b-bdbf-36fe53c01065","Type":"ContainerDied","Data":"b5c058fa6fba1f321db5a31630c4859c2b116d77056742dbeaabf37ae7c27509"}
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.001085 4764 scope.go:117] "RemoveContainer" containerID="2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c"
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.001130 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-k47db"
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.030235 4764 scope.go:117] "RemoveContainer" containerID="2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c"
Dec 03 23:47:51 crc kubenswrapper[4764]: E1203 23:47:51.030679 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c\": container with ID starting with 2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c not found: ID does not exist" containerID="2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c"
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.030781 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c"} err="failed to get container status \"2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c\": rpc error: code = NotFound desc = could not find container \"2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c\": container with ID starting with 2415a459f6215689b324f5670afb2b2dc472e84a787ab4ee34c33bd72216d87c not found: ID does not exist"
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.053906 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k47db"]
Dec 03 23:47:51 crc kubenswrapper[4764]: I1203 23:47:51.061357 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-k47db"]
Dec 03 23:47:52 crc kubenswrapper[4764]: I1203 23:47:52.552290 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32eecc0-7e82-4d0b-bdbf-36fe53c01065" path="/var/lib/kubelet/pods/c32eecc0-7e82-4d0b-bdbf-36fe53c01065/volumes"
Dec 03 23:48:20 crc kubenswrapper[4764]: I1203 23:48:20.868836 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:48:20 crc kubenswrapper[4764]: I1203 23:48:20.869541 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:48:20 crc kubenswrapper[4764]: I1203 23:48:20.869630 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl"
Dec 03 23:48:20 crc kubenswrapper[4764]: I1203 23:48:20.870688 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c24f4c5aebcf81ce5b2876f342868df69c51bd15468e185acbae0af2aee2250"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 23:48:20 crc kubenswrapper[4764]: I1203 23:48:20.870842 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://1c24f4c5aebcf81ce5b2876f342868df69c51bd15468e185acbae0af2aee2250" gracePeriod=600
Dec 03 23:48:21 crc kubenswrapper[4764]: I1203 23:48:21.213494 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="1c24f4c5aebcf81ce5b2876f342868df69c51bd15468e185acbae0af2aee2250" exitCode=0
Dec 03 23:48:21 crc kubenswrapper[4764]: I1203 23:48:21.213543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"1c24f4c5aebcf81ce5b2876f342868df69c51bd15468e185acbae0af2aee2250"}
Dec 03 23:48:21 crc kubenswrapper[4764]: I1203 23:48:21.213887 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"7ab2f4a31bccf115974b6283ee0f0675d1b86be8563605d25ffd2a3fbbe3cbda"}
Dec 03 23:48:21 crc kubenswrapper[4764]: I1203 23:48:21.213914 4764 scope.go:117] "RemoveContainer" containerID="c35e4a2c291587be0bfdf21ff69b34ebb4fd4f32b16689c4d1e53faeca99890c"
Dec 03 23:50:04 crc kubenswrapper[4764]: I1203 23:50:04.909684 4764 scope.go:117] "RemoveContainer" containerID="4377b022c80187209305eeb20d8aadbd1cd96f857a3b52d5f041f2f6eacd9a51"
Dec 03 23:50:50 crc kubenswrapper[4764]: I1203 23:50:50.873631 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:50:50 crc kubenswrapper[4764]: I1203 23:50:50.874155 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:51:04 crc kubenswrapper[4764]: I1203 23:51:04.964757 4764 scope.go:117] "RemoveContainer" containerID="6323439b1799b74f3d099a0efe93508fb38f55704644c69732d7e6de97c63176"
Dec 03 23:51:04 crc kubenswrapper[4764]: I1203 23:51:04.991846 4764 scope.go:117] "RemoveContainer" containerID="2cb5c37f2cb10dded97449eeb3c2342f1d56fda7fbd51679cb02466c2f6cb21d"
Dec 03 23:51:20 crc kubenswrapper[4764]: I1203 23:51:20.869443 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:51:20 crc kubenswrapper[4764]: I1203 23:51:20.870240 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:51:50 crc kubenswrapper[4764]: I1203 23:51:50.869103 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:51:50 crc kubenswrapper[4764]: I1203 23:51:50.869762 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:51:50 crc kubenswrapper[4764]: I1203 23:51:50.869833 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl"
Dec 03 23:51:50 crc kubenswrapper[4764]: I1203 23:51:50.871219 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ab2f4a31bccf115974b6283ee0f0675d1b86be8563605d25ffd2a3fbbe3cbda"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 23:51:50 crc kubenswrapper[4764]: I1203 23:51:50.871350 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://7ab2f4a31bccf115974b6283ee0f0675d1b86be8563605d25ffd2a3fbbe3cbda" gracePeriod=600
Dec 03 23:51:51 crc kubenswrapper[4764]: I1203 23:51:51.686696 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="7ab2f4a31bccf115974b6283ee0f0675d1b86be8563605d25ffd2a3fbbe3cbda" exitCode=0
Dec 03 23:51:51 crc kubenswrapper[4764]: I1203 23:51:51.686756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"7ab2f4a31bccf115974b6283ee0f0675d1b86be8563605d25ffd2a3fbbe3cbda"}
Dec 03 23:51:51 crc kubenswrapper[4764]: I1203 23:51:51.687007 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"1120b0acc6513dd274be5731fb29ccd1424c55fbae3e411c28a5dc8b386fb90b"}
Dec 03 23:51:51 crc kubenswrapper[4764]: I1203 23:51:51.687034 4764 scope.go:117] "RemoveContainer" containerID="1c24f4c5aebcf81ce5b2876f342868df69c51bd15468e185acbae0af2aee2250"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.047142 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jc5ck"]
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048241 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-controller" containerID="cri-o://150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048333 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="nbdb" containerID="cri-o://166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048378 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048419 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-acl-logging" containerID="cri-o://56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048358 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="northd" containerID="cri-o://7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048447 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-node" containerID="cri-o://e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.048571 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="sbdb" containerID="cri-o://61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.099744 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" containerID="cri-o://28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c" gracePeriod=30
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.341922 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/2.log"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.343001 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/1.log"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.343094 4764 generic.go:334] "Generic (PLEG): container finished" podID="8789b456-ab23-4316-880d-5c02242cd3fd" containerID="3b7a10a11e2b6f0c7c42801239417c3406c7425f92d20b8cddf777b62c812032" exitCode=2
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.343196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerDied","Data":"3b7a10a11e2b6f0c7c42801239417c3406c7425f92d20b8cddf777b62c812032"}
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.343271 4764 scope.go:117] "RemoveContainer" containerID="dbcb485e05bcd738157b8c4db1ffb3c15c28d8e08f8ad785c11f27b73749bdf4"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.343849 4764 scope.go:117] "RemoveContainer" containerID="3b7a10a11e2b6f0c7c42801239417c3406c7425f92d20b8cddf777b62c812032"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.359438 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovnkube-controller/3.log"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.365671 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovn-acl-logging/0.log"
Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366221 4764 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovn-controller/0.log" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366612 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c" exitCode=0 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366640 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8" exitCode=0 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366651 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e" exitCode=0 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366661 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95" exitCode=0 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366670 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8" exitCode=0 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366681 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66" exitCode=143 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366690 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2" exitCode=143 Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366695 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366803 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366817 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.366832 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2"} Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.385404 4764 scope.go:117] "RemoveContainer" containerID="ebb3b79a1e3c949b39fe81606855f655601c69586aa9475d20ce7a4a4b135112" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.413613 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovn-acl-logging/0.log" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.414799 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovn-controller/0.log" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.415284 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481245 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-systemd-units\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481364 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-kubelet\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481406 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-var-lib-openvswitch\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-ovn\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481529 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjntt\" (UniqueName: \"kubernetes.io/projected/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-kube-api-access-hjntt\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481548 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-etc-openvswitch\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-ovn-kubernetes\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481629 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-config\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc 
kubenswrapper[4764]: I1203 23:53:29.481650 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-netns\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-bin\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481749 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovn-node-metrics-cert\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-script-lib\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-openvswitch\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481853 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-log-socket\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481872 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481926 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-node-log\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.481984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-netd\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482003 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-systemd\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482030 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-env-overrides\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc 
kubenswrapper[4764]: I1203 23:53:29.482086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-slash\") pod \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\" (UID: \"9d56d81d-b8c8-43d2-a678-d34d2ae54e64\") " Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-slash" (OuterVolumeSpecName: "host-slash") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482592 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482638 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). 
InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.482660 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485380 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485450 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485506 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485538 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485584 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-node-log" (OuterVolumeSpecName: "node-log") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485645 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485669 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-log-socket" (OuterVolumeSpecName: "log-socket") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485759 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485786 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.486299 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.485409 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.489573 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-kube-api-access-hjntt" (OuterVolumeSpecName: "kube-api-access-hjntt") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "kube-api-access-hjntt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491583 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9kqcg"] Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491812 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491858 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491875 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="northd" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491890 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="northd" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491907 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-acl-logging" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491916 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-acl-logging" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491928 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491938 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491958 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-node" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491977 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-node" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.491986 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.491995 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492006 
4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="nbdb" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492014 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="nbdb" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492024 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kubecfg-setup" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492032 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kubecfg-setup" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492041 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="sbdb" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492049 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="sbdb" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492060 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32eecc0-7e82-4d0b-bdbf-36fe53c01065" containerName="registry" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492068 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32eecc0-7e82-4d0b-bdbf-36fe53c01065" containerName="registry" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492080 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492088 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492197 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-acl-logging" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492211 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492219 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="nbdb" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492228 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492236 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="sbdb" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492244 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="northd" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492256 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492265 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="kube-rbac-proxy-node" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492272 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32eecc0-7e82-4d0b-bdbf-36fe53c01065" containerName="registry" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492283 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovn-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492294 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492302 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492404 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492413 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: E1203 23:53:29.492426 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492434 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.492530 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerName="ovnkube-controller" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.494274 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.495100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.506311 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9d56d81d-b8c8-43d2-a678-d34d2ae54e64" (UID: "9d56d81d-b8c8-43d2-a678-d34d2ae54e64"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4d9q\" (UniqueName: \"kubernetes.io/projected/0016687c-2855-49ea-8cfa-383f398055b5-kube-api-access-n4d9q\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583807 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-cni-bin\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-kubelet\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583903 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-cni-netd\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0016687c-2855-49ea-8cfa-383f398055b5-ovn-node-metrics-cert\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.583982 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-ovnkube-script-lib\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584021 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-etc-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-systemd\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584058 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-slash\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584120 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-node-log\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-env-overrides\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-log-socket\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584197 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-systemd-units\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584234 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-var-lib-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-ovnkube-config\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584264 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-ovn\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584306 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-run-netns\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584341 4764 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584371 4764 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584379 4764 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584389 4764 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584397 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-var-lib-openvswitch\") 
on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584406 4764 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584430 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjntt\" (UniqueName: \"kubernetes.io/projected/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-kube-api-access-hjntt\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584442 4764 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584450 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584458 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584466 4764 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584475 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584482 4764 
reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584490 4764 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584498 4764 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584506 4764 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584516 4764 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584525 4764 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584534 4764 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.584543 4764 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/9d56d81d-b8c8-43d2-a678-d34d2ae54e64-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.685788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0016687c-2855-49ea-8cfa-383f398055b5-ovn-node-metrics-cert\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686367 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-ovnkube-script-lib\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686544 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-etc-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-systemd\") pod \"ovnkube-node-9kqcg\" (UID: 
\"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686688 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-slash\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-node-log\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-env-overrides\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.686997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-log-socket\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-systemd-units\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-var-lib-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687256 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-ovnkube-config\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-ovn\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687385 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-run-netns\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc 
kubenswrapper[4764]: I1203 23:53:29.687464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4d9q\" (UniqueName: \"kubernetes.io/projected/0016687c-2855-49ea-8cfa-383f398055b5-kube-api-access-n4d9q\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-cni-bin\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687702 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-kubelet\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-systemd-units\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 
23:53:29.687812 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-cni-netd\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687979 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687946 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-systemd\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-etc-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688069 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-slash\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-node-log\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-var-lib-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688190 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-cni-bin\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-kubelet\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.687703 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-ovnkube-script-lib\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688533 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-cni-netd\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688610 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-log-socket\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0016687c-2855-49ea-8cfa-383f398055b5-ovn-node-metrics-cert\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688768 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-host-run-netns\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-ovnkube-config\") pod \"ovnkube-node-9kqcg\" (UID: 
\"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.688815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-ovn\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.689454 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0016687c-2855-49ea-8cfa-383f398055b5-env-overrides\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.689555 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0016687c-2855-49ea-8cfa-383f398055b5-run-openvswitch\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.732869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4d9q\" (UniqueName: \"kubernetes.io/projected/0016687c-2855-49ea-8cfa-383f398055b5-kube-api-access-n4d9q\") pod \"ovnkube-node-9kqcg\" (UID: \"0016687c-2855-49ea-8cfa-383f398055b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: I1203 23:53:29.816102 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:29 crc kubenswrapper[4764]: W1203 23:53:29.835747 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0016687c_2855_49ea_8cfa_383f398055b5.slice/crio-692646e0d8d42facc9feeffee3e872946ac85aaa11a48ff0dc327866486e7db6 WatchSource:0}: Error finding container 692646e0d8d42facc9feeffee3e872946ac85aaa11a48ff0dc327866486e7db6: Status 404 returned error can't find the container with id 692646e0d8d42facc9feeffee3e872946ac85aaa11a48ff0dc327866486e7db6 Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.373474 4764 generic.go:334] "Generic (PLEG): container finished" podID="0016687c-2855-49ea-8cfa-383f398055b5" containerID="ff768433cc79d86144723ec38cc54bd5d2b206eb13eff6d73ce096c64a662f4e" exitCode=0 Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.373542 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerDied","Data":"ff768433cc79d86144723ec38cc54bd5d2b206eb13eff6d73ce096c64a662f4e"} Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.373881 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"692646e0d8d42facc9feeffee3e872946ac85aaa11a48ff0dc327866486e7db6"} Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.376330 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xj964_8789b456-ab23-4316-880d-5c02242cd3fd/kube-multus/2.log" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.376484 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xj964" 
event={"ID":"8789b456-ab23-4316-880d-5c02242cd3fd","Type":"ContainerStarted","Data":"29e409d9f4117a17491a88bbf874ef5657da4462327f18d427c8f8f0e0b54207"} Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.382160 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovn-acl-logging/0.log" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.382928 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jc5ck_9d56d81d-b8c8-43d2-a678-d34d2ae54e64/ovn-controller/0.log" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.383473 4764 generic.go:334] "Generic (PLEG): container finished" podID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" containerID="61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f" exitCode=0 Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.383651 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.394637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f"} Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.394753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jc5ck" event={"ID":"9d56d81d-b8c8-43d2-a678-d34d2ae54e64","Type":"ContainerDied","Data":"ab37222fc019385670800ba50f9ba9a585448678fa5b8ea48f92c60e20ff4306"} Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.394782 4764 scope.go:117] "RemoveContainer" containerID="28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.450470 4764 scope.go:117] "RemoveContainer" 
containerID="61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.471268 4764 scope.go:117] "RemoveContainer" containerID="166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.502928 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jc5ck"] Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.503603 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jc5ck"] Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.508255 4764 scope.go:117] "RemoveContainer" containerID="7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.521128 4764 scope.go:117] "RemoveContainer" containerID="eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.534328 4764 scope.go:117] "RemoveContainer" containerID="e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.551445 4764 scope.go:117] "RemoveContainer" containerID="56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.554866 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d56d81d-b8c8-43d2-a678-d34d2ae54e64" path="/var/lib/kubelet/pods/9d56d81d-b8c8-43d2-a678-d34d2ae54e64/volumes" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.570207 4764 scope.go:117] "RemoveContainer" containerID="150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.602414 4764 scope.go:117] "RemoveContainer" containerID="84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.624196 4764 
scope.go:117] "RemoveContainer" containerID="28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.624734 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c\": container with ID starting with 28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c not found: ID does not exist" containerID="28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.624839 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c"} err="failed to get container status \"28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c\": rpc error: code = NotFound desc = could not find container \"28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c\": container with ID starting with 28e57daf6056614fa0d6923265bcfddc7c9c863c9f85e12c7e9af801d1d3f42c not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.624930 4764 scope.go:117] "RemoveContainer" containerID="61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.625405 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\": container with ID starting with 61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f not found: ID does not exist" containerID="61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.625504 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f"} err="failed to get container status \"61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\": rpc error: code = NotFound desc = could not find container \"61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f\": container with ID starting with 61a2ef5fe94c3fc23df490062dbb660ee219a389647e56691522cb57bf16119f not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.625590 4764 scope.go:117] "RemoveContainer" containerID="166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.626175 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\": container with ID starting with 166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8 not found: ID does not exist" containerID="166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.626223 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8"} err="failed to get container status \"166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\": rpc error: code = NotFound desc = could not find container \"166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8\": container with ID starting with 166aaf7bda65a85a21e1d88af99d3d3f1df45d6b9308c63d7dbf0a8ea37deba8 not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.626256 4764 scope.go:117] "RemoveContainer" containerID="7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.626631 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\": container with ID starting with 7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e not found: ID does not exist" containerID="7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.626737 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e"} err="failed to get container status \"7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\": rpc error: code = NotFound desc = could not find container \"7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e\": container with ID starting with 7c361ceaf4a9eca055014315e60973b14db74102aa4a302fc9f51488edf5d06e not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.626825 4764 scope.go:117] "RemoveContainer" containerID="eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.627246 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\": container with ID starting with eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95 not found: ID does not exist" containerID="eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.627357 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95"} err="failed to get container status \"eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\": rpc error: code = NotFound desc = could not find container 
\"eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95\": container with ID starting with eac2b7ff076d75c0d96746c6622a5dc79e830934944c4cd130e7cecb16571a95 not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.627443 4764 scope.go:117] "RemoveContainer" containerID="e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.628472 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\": container with ID starting with e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8 not found: ID does not exist" containerID="e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.628562 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8"} err="failed to get container status \"e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\": rpc error: code = NotFound desc = could not find container \"e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8\": container with ID starting with e7f43137443d8267cf3233741a9d6e572ad78c239ec57915d1c65a740f4150d8 not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.628646 4764 scope.go:117] "RemoveContainer" containerID="56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.633196 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\": container with ID starting with 56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66 not found: ID does not exist" 
containerID="56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.633298 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66"} err="failed to get container status \"56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\": rpc error: code = NotFound desc = could not find container \"56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66\": container with ID starting with 56fced0e281cab4e73bd9a03f4738e719462c87a92f10f33ee091f8a25b85e66 not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.633404 4764 scope.go:117] "RemoveContainer" containerID="150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.634094 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\": container with ID starting with 150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2 not found: ID does not exist" containerID="150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.634206 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2"} err="failed to get container status \"150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\": rpc error: code = NotFound desc = could not find container \"150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2\": container with ID starting with 150ccdbe41a0ce22c7704c6543b637ebaa6a10e8217e783f53432419bf1704f2 not found: ID does not exist" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.634301 4764 scope.go:117] 
"RemoveContainer" containerID="84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77" Dec 03 23:53:30 crc kubenswrapper[4764]: E1203 23:53:30.637032 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\": container with ID starting with 84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77 not found: ID does not exist" containerID="84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77" Dec 03 23:53:30 crc kubenswrapper[4764]: I1203 23:53:30.637326 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77"} err="failed to get container status \"84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\": rpc error: code = NotFound desc = could not find container \"84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77\": container with ID starting with 84e5c81dc35d127c37d9655e168b736cc3cfe7ddfba8698b223c94080eb9da77 not found: ID does not exist" Dec 03 23:53:31 crc kubenswrapper[4764]: I1203 23:53:31.396551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"81b9374704f5a47543fe645d321ff765033b864ce04fc61740f014a61731433a"} Dec 03 23:53:31 crc kubenswrapper[4764]: I1203 23:53:31.398159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"ebc2e4a9e4f1480623c5e5146a69a55e2c64dd8e3f9e95a3d68e70d76d648365"} Dec 03 23:53:31 crc kubenswrapper[4764]: I1203 23:53:31.398200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" 
event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"cf7d32753be2c2a1476d03db624cca6e120c1015d7d952fd7d5801f4e4f1dcf0"} Dec 03 23:53:31 crc kubenswrapper[4764]: I1203 23:53:31.398218 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"b2382ecc0f443e675fc6b50d8f80d790b79e11014951e4e561fb6827f3878b1d"} Dec 03 23:53:31 crc kubenswrapper[4764]: I1203 23:53:31.398236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"f484b453ba2f5b229f974052794e8e3358c58ab69cb04ed369ecdacdcc0ac0c8"} Dec 03 23:53:31 crc kubenswrapper[4764]: I1203 23:53:31.398256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"627b45f56cbf228585343e0aed1675da128c3606e4084ec1b9fa0073d1d0f0af"} Dec 03 23:53:33 crc kubenswrapper[4764]: I1203 23:53:33.428000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"6382ff22b3c5eb643e7c4b212f05b4efad3ca0b918ab34b448d166119f004227"} Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.462395 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" event={"ID":"0016687c-2855-49ea-8cfa-383f398055b5","Type":"ContainerStarted","Data":"32f4be8968e00ae8bc094316ee023604053ceb2a40cf38358ee8987f00966139"} Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.463687 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.463818 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.508530 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" podStartSLOduration=7.508508798 podStartE2EDuration="7.508508798s" podCreationTimestamp="2025-12-03 23:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:53:36.501326912 +0000 UTC m=+752.262651363" watchObservedRunningTime="2025-12-03 23:53:36.508508798 +0000 UTC m=+752.269833219" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.526943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.622478 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8xlmg"] Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.629118 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.673027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9rkn\" (UniqueName: \"kubernetes.io/projected/07053195-0b45-49e2-8c63-ad9a547f0714-kube-api-access-d9rkn\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.673243 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-utilities\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.673369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-catalog-content\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.774210 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-catalog-content\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.774492 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9rkn\" (UniqueName: \"kubernetes.io/projected/07053195-0b45-49e2-8c63-ad9a547f0714-kube-api-access-d9rkn\") pod 
\"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.774672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-utilities\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.774806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-catalog-content\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.775418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-utilities\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.799459 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9rkn\" (UniqueName: \"kubernetes.io/projected/07053195-0b45-49e2-8c63-ad9a547f0714-kube-api-access-d9rkn\") pod \"certified-operators-8xlmg\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: I1203 23:53:36.954886 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: E1203 23:53:36.982230 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(c5dbbb0c71e9a639c29d980761ee5c114523eb161ec6d975014e7908d49d80cd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 23:53:36 crc kubenswrapper[4764]: E1203 23:53:36.982305 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(c5dbbb0c71e9a639c29d980761ee5c114523eb161ec6d975014e7908d49d80cd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: E1203 23:53:36.982325 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(c5dbbb0c71e9a639c29d980761ee5c114523eb161ec6d975014e7908d49d80cd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:36 crc kubenswrapper[4764]: E1203 23:53:36.982360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-8xlmg_openshift-marketplace(07053195-0b45-49e2-8c63-ad9a547f0714)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-8xlmg_openshift-marketplace(07053195-0b45-49e2-8c63-ad9a547f0714)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(c5dbbb0c71e9a639c29d980761ee5c114523eb161ec6d975014e7908d49d80cd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-8xlmg" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.120820 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-x6gzw"] Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.121784 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.125207 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.125263 4764 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jrrvd" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.125499 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.127426 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.179195 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-crc-storage\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.179243 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-node-mnt\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.179273 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9dxs\" (UniqueName: \"kubernetes.io/projected/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-kube-api-access-n9dxs\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.280883 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-crc-storage\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.281131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-node-mnt\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.281235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9dxs\" (UniqueName: \"kubernetes.io/projected/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-kube-api-access-n9dxs\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.281456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-node-mnt\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.281760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-crc-storage\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.298152 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9dxs\" (UniqueName: 
\"kubernetes.io/projected/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-kube-api-access-n9dxs\") pod \"crc-storage-crc-x6gzw\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.433078 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.455226 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(324cd384789bd45ffb6857cc78bb806da789cad22f59044407db57410d3fa5e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.455290 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(324cd384789bd45ffb6857cc78bb806da789cad22f59044407db57410d3fa5e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.455309 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(324cd384789bd45ffb6857cc78bb806da789cad22f59044407db57410d3fa5e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.455367 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-x6gzw_crc-storage(9bc55aa0-c11a-4b89-a6bc-38d4967c5204)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-x6gzw_crc-storage(9bc55aa0-c11a-4b89-a6bc-38d4967c5204)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(324cd384789bd45ffb6857cc78bb806da789cad22f59044407db57410d3fa5e5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-x6gzw" podUID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.467026 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.501984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.535096 4764 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.627325 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xlmg"] Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.627443 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.627959 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.650470 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-x6gzw"] Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.650574 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: I1203 23:53:37.651009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.656905 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(95bcaba69ca65203b85b43c435c2bc7c4c1f6f3642f29bcf65e82aa08bb5941f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.657079 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(95bcaba69ca65203b85b43c435c2bc7c4c1f6f3642f29bcf65e82aa08bb5941f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.657158 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(95bcaba69ca65203b85b43c435c2bc7c4c1f6f3642f29bcf65e82aa08bb5941f): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.657257 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"certified-operators-8xlmg_openshift-marketplace(07053195-0b45-49e2-8c63-ad9a547f0714)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"certified-operators-8xlmg_openshift-marketplace(07053195-0b45-49e2-8c63-ad9a547f0714)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_certified-operators-8xlmg_openshift-marketplace_07053195-0b45-49e2-8c63-ad9a547f0714_0(95bcaba69ca65203b85b43c435c2bc7c4c1f6f3642f29bcf65e82aa08bb5941f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/certified-operators-8xlmg" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.676488 4764 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(7855ce86d7b7233635396e0fe5c0a8ba21cfe80ba53bb150e6666310932627b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.676563 4764 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(7855ce86d7b7233635396e0fe5c0a8ba21cfe80ba53bb150e6666310932627b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.676585 4764 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(7855ce86d7b7233635396e0fe5c0a8ba21cfe80ba53bb150e6666310932627b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:37 crc kubenswrapper[4764]: E1203 23:53:37.676645 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-x6gzw_crc-storage(9bc55aa0-c11a-4b89-a6bc-38d4967c5204)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-x6gzw_crc-storage(9bc55aa0-c11a-4b89-a6bc-38d4967c5204)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-x6gzw_crc-storage_9bc55aa0-c11a-4b89-a6bc-38d4967c5204_0(7855ce86d7b7233635396e0fe5c0a8ba21cfe80ba53bb150e6666310932627b0): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-x6gzw" podUID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.001639 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v2v7h"] Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.004978 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.024332 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2v7h"] Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.165979 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7glwh\" (UniqueName: \"kubernetes.io/projected/b60228dd-d3ed-4242-9847-d138749d2da1-kube-api-access-7glwh\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.166057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-catalog-content\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.166113 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-utilities\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.267159 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7glwh\" (UniqueName: \"kubernetes.io/projected/b60228dd-d3ed-4242-9847-d138749d2da1-kube-api-access-7glwh\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.267213 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-catalog-content\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.267247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-utilities\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.267811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-utilities\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.267846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-catalog-content\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.294002 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7glwh\" (UniqueName: \"kubernetes.io/projected/b60228dd-d3ed-4242-9847-d138749d2da1-kube-api-access-7glwh\") pod \"redhat-operators-v2v7h\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.334018 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:43 crc kubenswrapper[4764]: I1203 23:53:43.764431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v2v7h"] Dec 03 23:53:43 crc kubenswrapper[4764]: W1203 23:53:43.775863 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60228dd_d3ed_4242_9847_d138749d2da1.slice/crio-dfcb16c681bbbcb0ae16ea29beed4b265926a7ae96234eecf0c411108aeffa22 WatchSource:0}: Error finding container dfcb16c681bbbcb0ae16ea29beed4b265926a7ae96234eecf0c411108aeffa22: Status 404 returned error can't find the container with id dfcb16c681bbbcb0ae16ea29beed4b265926a7ae96234eecf0c411108aeffa22 Dec 03 23:53:44 crc kubenswrapper[4764]: I1203 23:53:44.517563 4764 generic.go:334] "Generic (PLEG): container finished" podID="b60228dd-d3ed-4242-9847-d138749d2da1" containerID="f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b" exitCode=0 Dec 03 23:53:44 crc kubenswrapper[4764]: I1203 23:53:44.517703 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerDied","Data":"f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b"} Dec 03 23:53:44 crc kubenswrapper[4764]: I1203 23:53:44.518164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerStarted","Data":"dfcb16c681bbbcb0ae16ea29beed4b265926a7ae96234eecf0c411108aeffa22"} Dec 03 23:53:44 crc kubenswrapper[4764]: I1203 23:53:44.521203 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:53:45 crc kubenswrapper[4764]: I1203 23:53:45.531888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerStarted","Data":"7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b"} Dec 03 23:53:46 crc kubenswrapper[4764]: I1203 23:53:46.541621 4764 generic.go:334] "Generic (PLEG): container finished" podID="b60228dd-d3ed-4242-9847-d138749d2da1" containerID="7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b" exitCode=0 Dec 03 23:53:46 crc kubenswrapper[4764]: I1203 23:53:46.541680 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerDied","Data":"7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b"} Dec 03 23:53:47 crc kubenswrapper[4764]: I1203 23:53:47.549158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerStarted","Data":"2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b"} Dec 03 23:53:47 crc kubenswrapper[4764]: I1203 23:53:47.568384 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v2v7h" podStartSLOduration=3.133400426 podStartE2EDuration="5.568359765s" podCreationTimestamp="2025-12-03 23:53:42 +0000 UTC" firstStartedPulling="2025-12-03 23:53:44.520044006 +0000 UTC m=+760.281368457" lastFinishedPulling="2025-12-03 23:53:46.955003345 +0000 UTC m=+762.716327796" observedRunningTime="2025-12-03 23:53:47.568022887 +0000 UTC m=+763.329347308" watchObservedRunningTime="2025-12-03 23:53:47.568359765 +0000 UTC m=+763.329684186" Dec 03 23:53:48 crc kubenswrapper[4764]: I1203 23:53:48.544972 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:48 crc kubenswrapper[4764]: I1203 23:53:48.545572 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:48 crc kubenswrapper[4764]: I1203 23:53:48.798876 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xlmg"] Dec 03 23:53:49 crc kubenswrapper[4764]: I1203 23:53:49.573856 4764 generic.go:334] "Generic (PLEG): container finished" podID="07053195-0b45-49e2-8c63-ad9a547f0714" containerID="110042ad61a3df6b77674044fd7d0bc74910a895af3ead267799ff084160d309" exitCode=0 Dec 03 23:53:49 crc kubenswrapper[4764]: I1203 23:53:49.573926 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerDied","Data":"110042ad61a3df6b77674044fd7d0bc74910a895af3ead267799ff084160d309"} Dec 03 23:53:49 crc kubenswrapper[4764]: I1203 23:53:49.573957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerStarted","Data":"8f9139c31e213a5478e1987aa3baa14a7b89c15e44300843f50cb03525f02fb3"} Dec 03 23:53:50 crc kubenswrapper[4764]: I1203 23:53:50.545614 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:50 crc kubenswrapper[4764]: I1203 23:53:50.546361 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:50 crc kubenswrapper[4764]: I1203 23:53:50.590450 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerStarted","Data":"482b5e302b8d8f82fac0b31ce41d49116e8de4f24e551b8a3a9f557444298633"} Dec 03 23:53:50 crc kubenswrapper[4764]: I1203 23:53:50.730273 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-x6gzw"] Dec 03 23:53:51 crc kubenswrapper[4764]: I1203 23:53:51.600292 4764 generic.go:334] "Generic (PLEG): container finished" podID="07053195-0b45-49e2-8c63-ad9a547f0714" containerID="482b5e302b8d8f82fac0b31ce41d49116e8de4f24e551b8a3a9f557444298633" exitCode=0 Dec 03 23:53:51 crc kubenswrapper[4764]: I1203 23:53:51.600422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerDied","Data":"482b5e302b8d8f82fac0b31ce41d49116e8de4f24e551b8a3a9f557444298633"} Dec 03 23:53:51 crc kubenswrapper[4764]: I1203 23:53:51.610095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-x6gzw" event={"ID":"9bc55aa0-c11a-4b89-a6bc-38d4967c5204","Type":"ContainerStarted","Data":"4675d8fb62b9f325bf11a2ec650e49b8cea8a79ece88f0a1be6b35eb0726e9b5"} Dec 03 23:53:52 crc kubenswrapper[4764]: I1203 23:53:52.618888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerStarted","Data":"e3f6a557dda880f683deda92b44d3cff5368fbeece001670ad8ed190463004f2"} Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 23:53:53.334625 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 
23:53:53.335002 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 23:53:53.383917 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 23:53:53.403055 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8xlmg" podStartSLOduration=14.562025745 podStartE2EDuration="17.403038728s" podCreationTimestamp="2025-12-03 23:53:36 +0000 UTC" firstStartedPulling="2025-12-03 23:53:49.576598774 +0000 UTC m=+765.337923205" lastFinishedPulling="2025-12-03 23:53:52.417611777 +0000 UTC m=+768.178936188" observedRunningTime="2025-12-03 23:53:52.644097284 +0000 UTC m=+768.405421695" watchObservedRunningTime="2025-12-03 23:53:53.403038728 +0000 UTC m=+769.164363139" Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 23:53:53.625595 4764 generic.go:334] "Generic (PLEG): container finished" podID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" containerID="ebfb6447773808600dfd68ef4a8ebc888911b18335cb3bd557ac538309ce7f02" exitCode=0 Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 23:53:53.625694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-x6gzw" event={"ID":"9bc55aa0-c11a-4b89-a6bc-38d4967c5204","Type":"ContainerDied","Data":"ebfb6447773808600dfd68ef4a8ebc888911b18335cb3bd557ac538309ce7f02"} Dec 03 23:53:53 crc kubenswrapper[4764]: I1203 23:53:53.689810 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:54 crc kubenswrapper[4764]: I1203 23:53:54.895570 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.031323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-crc-storage\") pod \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.031411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9dxs\" (UniqueName: \"kubernetes.io/projected/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-kube-api-access-n9dxs\") pod \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.031444 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-node-mnt\") pod \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\" (UID: \"9bc55aa0-c11a-4b89-a6bc-38d4967c5204\") " Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.031709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9bc55aa0-c11a-4b89-a6bc-38d4967c5204" (UID: "9bc55aa0-c11a-4b89-a6bc-38d4967c5204"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.037262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-kube-api-access-n9dxs" (OuterVolumeSpecName: "kube-api-access-n9dxs") pod "9bc55aa0-c11a-4b89-a6bc-38d4967c5204" (UID: "9bc55aa0-c11a-4b89-a6bc-38d4967c5204"). InnerVolumeSpecName "kube-api-access-n9dxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.048081 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9bc55aa0-c11a-4b89-a6bc-38d4967c5204" (UID: "9bc55aa0-c11a-4b89-a6bc-38d4967c5204"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.134952 4764 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.135027 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9dxs\" (UniqueName: \"kubernetes.io/projected/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-kube-api-access-n9dxs\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.135059 4764 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9bc55aa0-c11a-4b89-a6bc-38d4967c5204-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.371828 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2v7h"] Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.641589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-x6gzw" event={"ID":"9bc55aa0-c11a-4b89-a6bc-38d4967c5204","Type":"ContainerDied","Data":"4675d8fb62b9f325bf11a2ec650e49b8cea8a79ece88f0a1be6b35eb0726e9b5"} Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.641666 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4675d8fb62b9f325bf11a2ec650e49b8cea8a79ece88f0a1be6b35eb0726e9b5" Dec 03 23:53:55 
crc kubenswrapper[4764]: I1203 23:53:55.641614 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-x6gzw" Dec 03 23:53:55 crc kubenswrapper[4764]: I1203 23:53:55.641843 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v2v7h" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="registry-server" containerID="cri-o://2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b" gracePeriod=2 Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.047222 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.148594 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-utilities\") pod \"b60228dd-d3ed-4242-9847-d138749d2da1\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.148656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-catalog-content\") pod \"b60228dd-d3ed-4242-9847-d138749d2da1\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.148758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7glwh\" (UniqueName: \"kubernetes.io/projected/b60228dd-d3ed-4242-9847-d138749d2da1-kube-api-access-7glwh\") pod \"b60228dd-d3ed-4242-9847-d138749d2da1\" (UID: \"b60228dd-d3ed-4242-9847-d138749d2da1\") " Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.149824 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-utilities" (OuterVolumeSpecName: "utilities") pod "b60228dd-d3ed-4242-9847-d138749d2da1" (UID: "b60228dd-d3ed-4242-9847-d138749d2da1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.153255 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b60228dd-d3ed-4242-9847-d138749d2da1-kube-api-access-7glwh" (OuterVolumeSpecName: "kube-api-access-7glwh") pod "b60228dd-d3ed-4242-9847-d138749d2da1" (UID: "b60228dd-d3ed-4242-9847-d138749d2da1"). InnerVolumeSpecName "kube-api-access-7glwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.250519 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7glwh\" (UniqueName: \"kubernetes.io/projected/b60228dd-d3ed-4242-9847-d138749d2da1-kube-api-access-7glwh\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.250760 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.652201 4764 generic.go:334] "Generic (PLEG): container finished" podID="b60228dd-d3ed-4242-9847-d138749d2da1" containerID="2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b" exitCode=0 Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.652290 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v2v7h" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.652289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerDied","Data":"2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b"} Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.652833 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v2v7h" event={"ID":"b60228dd-d3ed-4242-9847-d138749d2da1","Type":"ContainerDied","Data":"dfcb16c681bbbcb0ae16ea29beed4b265926a7ae96234eecf0c411108aeffa22"} Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.652869 4764 scope.go:117] "RemoveContainer" containerID="2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.676927 4764 scope.go:117] "RemoveContainer" containerID="7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.700072 4764 scope.go:117] "RemoveContainer" containerID="f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.734396 4764 scope.go:117] "RemoveContainer" containerID="2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b" Dec 03 23:53:56 crc kubenswrapper[4764]: E1203 23:53:56.735237 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b\": container with ID starting with 2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b not found: ID does not exist" containerID="2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.735285 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b"} err="failed to get container status \"2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b\": rpc error: code = NotFound desc = could not find container \"2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b\": container with ID starting with 2cc860211979b92e1dee67416869690aca9f37938a29bd098500202eade7a29b not found: ID does not exist" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.735361 4764 scope.go:117] "RemoveContainer" containerID="7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b" Dec 03 23:53:56 crc kubenswrapper[4764]: E1203 23:53:56.736566 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b\": container with ID starting with 7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b not found: ID does not exist" containerID="7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.736656 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b"} err="failed to get container status \"7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b\": rpc error: code = NotFound desc = could not find container \"7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b\": container with ID starting with 7f23f6cee2533f992f513ff30539dd0357c01d60c4d5cd86bd5f178c7605c44b not found: ID does not exist" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.736766 4764 scope.go:117] "RemoveContainer" containerID="f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b" Dec 03 23:53:56 crc kubenswrapper[4764]: E1203 23:53:56.738633 4764 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b\": container with ID starting with f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b not found: ID does not exist" containerID="f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.738706 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b"} err="failed to get container status \"f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b\": rpc error: code = NotFound desc = could not find container \"f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b\": container with ID starting with f6cb3012dfe3ddbad16de2df6f2f501d99244d91d87fec69402a5db0fbdd0e6b not found: ID does not exist" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.955705 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:56 crc kubenswrapper[4764]: I1203 23:53:56.956387 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:57 crc kubenswrapper[4764]: I1203 23:53:57.032569 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:57 crc kubenswrapper[4764]: I1203 23:53:57.717277 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:53:58 crc kubenswrapper[4764]: I1203 23:53:58.152184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-catalog-content" (OuterVolumeSpecName: "catalog-content") 
pod "b60228dd-d3ed-4242-9847-d138749d2da1" (UID: "b60228dd-d3ed-4242-9847-d138749d2da1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:53:58 crc kubenswrapper[4764]: I1203 23:53:58.180851 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b60228dd-d3ed-4242-9847-d138749d2da1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:53:58 crc kubenswrapper[4764]: I1203 23:53:58.186309 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v2v7h"] Dec 03 23:53:58 crc kubenswrapper[4764]: I1203 23:53:58.190663 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v2v7h"] Dec 03 23:53:58 crc kubenswrapper[4764]: I1203 23:53:58.560145 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" path="/var/lib/kubelet/pods/b60228dd-d3ed-4242-9847-d138749d2da1/volumes" Dec 03 23:53:58 crc kubenswrapper[4764]: I1203 23:53:58.773763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8xlmg"] Dec 03 23:53:59 crc kubenswrapper[4764]: I1203 23:53:59.858153 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9kqcg" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.607507 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-567bg"] Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.607932 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-567bg" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="registry-server" containerID="cri-o://0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" gracePeriod=30 Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 
23:54:00.625780 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-579hd"] Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.626398 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-579hd" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="registry-server" containerID="cri-o://0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca" gracePeriod=30 Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.644824 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mzk5"] Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.645186 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" podUID="b613c3f6-59f7-46b1-90ba-09793e962453" containerName="marketplace-operator" containerID="cri-o://d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077" gracePeriod=30 Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.654184 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4vk9"] Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.654438 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d4vk9" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="registry-server" containerID="cri-o://a4cf9da8c6785ca4ac0659d435c7d3297e9503835bdd3640b6cb7f6b2a485c27" gracePeriod=30 Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.661456 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zl7q7"] Dec 03 23:54:00 crc kubenswrapper[4764]: E1203 23:54:00.661821 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" 
containerName="extract-utilities" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.661841 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="extract-utilities" Dec 03 23:54:00 crc kubenswrapper[4764]: E1203 23:54:00.661858 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="extract-content" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.661868 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="extract-content" Dec 03 23:54:00 crc kubenswrapper[4764]: E1203 23:54:00.661884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="registry-server" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.661894 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="registry-server" Dec 03 23:54:00 crc kubenswrapper[4764]: E1203 23:54:00.661907 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" containerName="storage" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.661916 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" containerName="storage" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.662073 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b60228dd-d3ed-4242-9847-d138749d2da1" containerName="registry-server" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.662094 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" containerName="storage" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.662587 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.672962 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4q9cs"] Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.673239 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4q9cs" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="registry-server" containerID="cri-o://676d2771f19ef22634b8c77a3e146bdd263fdcecf537e9bf91485cd8f6176d02" gracePeriod=30 Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.677380 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zl7q7"] Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.679085 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8xlmg" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="registry-server" containerID="cri-o://e3f6a557dda880f683deda92b44d3cff5368fbeece001670ad8ed190463004f2" gracePeriod=2 Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.719387 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e53002-80dd-456b-8da8-e7dc634450d8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.719456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nm9c\" (UniqueName: \"kubernetes.io/projected/b1e53002-80dd-456b-8da8-e7dc634450d8-kube-api-access-5nm9c\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: 
\"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.719477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e53002-80dd-456b-8da8-e7dc634450d8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.820646 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e53002-80dd-456b-8da8-e7dc634450d8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.820739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nm9c\" (UniqueName: \"kubernetes.io/projected/b1e53002-80dd-456b-8da8-e7dc634450d8-kube-api-access-5nm9c\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.820762 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e53002-80dd-456b-8da8-e7dc634450d8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.822039 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1e53002-80dd-456b-8da8-e7dc634450d8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.830570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b1e53002-80dd-456b-8da8-e7dc634450d8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.836233 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nm9c\" (UniqueName: \"kubernetes.io/projected/b1e53002-80dd-456b-8da8-e7dc634450d8-kube-api-access-5nm9c\") pod \"marketplace-operator-79b997595-zl7q7\" (UID: \"b1e53002-80dd-456b-8da8-e7dc634450d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:00 crc kubenswrapper[4764]: I1203 23:54:00.993513 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:01 crc kubenswrapper[4764]: I1203 23:54:01.489419 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zl7q7"] Dec 03 23:54:01 crc kubenswrapper[4764]: I1203 23:54:01.686482 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" event={"ID":"b1e53002-80dd-456b-8da8-e7dc634450d8","Type":"ContainerStarted","Data":"5fca5ade16861737d6f0df7220c8864a761aaf3389a9429599f67012db0c7620"} Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.226459 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563 is running failed: container process not found" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.227072 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563 is running failed: container process not found" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.231919 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563 is running failed: container process not found" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 23:54:02 crc kubenswrapper[4764]: 
E1203 23:54:02.231987 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-567bg" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="registry-server" Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.462755 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca is running failed: container process not found" containerID="0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.463436 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca is running failed: container process not found" containerID="0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.463885 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca is running failed: container process not found" containerID="0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 23:54:02 crc kubenswrapper[4764]: E1203 23:54:02.463967 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-579hd" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="registry-server" Dec 03 23:54:02 crc kubenswrapper[4764]: I1203 23:54:02.972788 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l42gq"] Dec 03 23:54:02 crc kubenswrapper[4764]: I1203 23:54:02.974761 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:02 crc kubenswrapper[4764]: I1203 23:54:02.987164 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l42gq"] Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.048933 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-catalog-content\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.049001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-utilities\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.049047 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhqh\" (UniqueName: \"kubernetes.io/projected/6f9398cf-e8ac-4bb6-b3ac-670859827874-kube-api-access-qjhqh\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " 
pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.150304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-catalog-content\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.150398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-utilities\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.150460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhqh\" (UniqueName: \"kubernetes.io/projected/6f9398cf-e8ac-4bb6-b3ac-670859827874-kube-api-access-qjhqh\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.151269 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-utilities\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.151670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-catalog-content\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " 
pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.177803 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8zpdm"] Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.177964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhqh\" (UniqueName: \"kubernetes.io/projected/6f9398cf-e8ac-4bb6-b3ac-670859827874-kube-api-access-qjhqh\") pod \"community-operators-l42gq\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.179175 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.191108 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zpdm"] Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.251767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzph\" (UniqueName: \"kubernetes.io/projected/a682d811-4c67-413d-b035-1e06ca723876-kube-api-access-5vzph\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.251851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-catalog-content\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.251880 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-utilities\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.295405 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.353628 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzph\" (UniqueName: \"kubernetes.io/projected/a682d811-4c67-413d-b035-1e06ca723876-kube-api-access-5vzph\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.353733 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-catalog-content\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.353765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-utilities\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.354896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-utilities\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 
23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.357337 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-catalog-content\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.373565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzph\" (UniqueName: \"kubernetes.io/projected/a682d811-4c67-413d-b035-1e06ca723876-kube-api-access-5vzph\") pod \"redhat-marketplace-8zpdm\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.524665 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l42gq"] Dec 03 23:54:03 crc kubenswrapper[4764]: W1203 23:54:03.549879 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9398cf_e8ac_4bb6_b3ac_670859827874.slice/crio-ca45b7343f6975576200535e2a22609611b58e0a6a9e31995d4d6883b5794aa7 WatchSource:0}: Error finding container ca45b7343f6975576200535e2a22609611b58e0a6a9e31995d4d6883b5794aa7: Status 404 returned error can't find the container with id ca45b7343f6975576200535e2a22609611b58e0a6a9e31995d4d6883b5794aa7 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.609368 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.639652 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.691891 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.701607 4764 generic.go:334] "Generic (PLEG): container finished" podID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerID="a4cf9da8c6785ca4ac0659d435c7d3297e9503835bdd3640b6cb7f6b2a485c27" exitCode=0 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.701703 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4vk9" event={"ID":"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db","Type":"ContainerDied","Data":"a4cf9da8c6785ca4ac0659d435c7d3297e9503835bdd3640b6cb7f6b2a485c27"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.704383 4764 generic.go:334] "Generic (PLEG): container finished" podID="07053195-0b45-49e2-8c63-ad9a547f0714" containerID="e3f6a557dda880f683deda92b44d3cff5368fbeece001670ad8ed190463004f2" exitCode=0 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.704440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerDied","Data":"e3f6a557dda880f683deda92b44d3cff5368fbeece001670ad8ed190463004f2"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.709655 4764 generic.go:334] "Generic (PLEG): container finished" podID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" exitCode=0 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.709763 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-567bg" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.709768 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567bg" event={"ID":"e437244c-a1f6-4f74-bfc6-8eb8366719d4","Type":"ContainerDied","Data":"0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.709837 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-567bg" event={"ID":"e437244c-a1f6-4f74-bfc6-8eb8366719d4","Type":"ContainerDied","Data":"8c08ade6d0a7667d4682bcd8cd68dde2bdc792961cf55a4d2b7dea7ae0b52103"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.709872 4764 scope.go:117] "RemoveContainer" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.714617 4764 generic.go:334] "Generic (PLEG): container finished" podID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerID="0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca" exitCode=0 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.714700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-579hd" event={"ID":"cf0a4d00-e173-46ad-9332-7b8cf8801cb3","Type":"ContainerDied","Data":"0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.718493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerStarted","Data":"75d9fdeab297bac18b64e653c45752f503f9d4b8e6d2e1292092b68ed19609ac"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.718532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" 
event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerStarted","Data":"ca45b7343f6975576200535e2a22609611b58e0a6a9e31995d4d6883b5794aa7"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.722417 4764 generic.go:334] "Generic (PLEG): container finished" podID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerID="676d2771f19ef22634b8c77a3e146bdd263fdcecf537e9bf91485cd8f6176d02" exitCode=0 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.722481 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerDied","Data":"676d2771f19ef22634b8c77a3e146bdd263fdcecf537e9bf91485cd8f6176d02"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.725400 4764 generic.go:334] "Generic (PLEG): container finished" podID="b613c3f6-59f7-46b1-90ba-09793e962453" containerID="d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077" exitCode=0 Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.725473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" event={"ID":"b613c3f6-59f7-46b1-90ba-09793e962453","Type":"ContainerDied","Data":"d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.725482 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.725497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7mzk5" event={"ID":"b613c3f6-59f7-46b1-90ba-09793e962453","Type":"ContainerDied","Data":"780a3ccb1c4dacbdf4c5d027449808cadcc62d4faf4b59debec43949ce302f69"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.728590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" event={"ID":"b1e53002-80dd-456b-8da8-e7dc634450d8","Type":"ContainerStarted","Data":"0b663e61c4bc1979737d43ee5a5db53f66d034d769cc8af13b3bd69860588a8e"} Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.728865 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.736833 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.760518 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-operator-metrics\") pod \"b613c3f6-59f7-46b1-90ba-09793e962453\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.760596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-trusted-ca\") pod \"b613c3f6-59f7-46b1-90ba-09793e962453\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.760643 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-tv467\" (UniqueName: \"kubernetes.io/projected/e437244c-a1f6-4f74-bfc6-8eb8366719d4-kube-api-access-tv467\") pod \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.760660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs4bs\" (UniqueName: \"kubernetes.io/projected/b613c3f6-59f7-46b1-90ba-09793e962453-kube-api-access-fs4bs\") pod \"b613c3f6-59f7-46b1-90ba-09793e962453\" (UID: \"b613c3f6-59f7-46b1-90ba-09793e962453\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.760746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-utilities\") pod \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.760790 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-catalog-content\") pod \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\" (UID: \"e437244c-a1f6-4f74-bfc6-8eb8366719d4\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.762504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-utilities" (OuterVolumeSpecName: "utilities") pod "e437244c-a1f6-4f74-bfc6-8eb8366719d4" (UID: "e437244c-a1f6-4f74-bfc6-8eb8366719d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.762692 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.765787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b613c3f6-59f7-46b1-90ba-09793e962453" (UID: "b613c3f6-59f7-46b1-90ba-09793e962453"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.769137 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e437244c-a1f6-4f74-bfc6-8eb8366719d4-kube-api-access-tv467" (OuterVolumeSpecName: "kube-api-access-tv467") pod "e437244c-a1f6-4f74-bfc6-8eb8366719d4" (UID: "e437244c-a1f6-4f74-bfc6-8eb8366719d4"). InnerVolumeSpecName "kube-api-access-tv467". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.769155 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b613c3f6-59f7-46b1-90ba-09793e962453-kube-api-access-fs4bs" (OuterVolumeSpecName: "kube-api-access-fs4bs") pod "b613c3f6-59f7-46b1-90ba-09793e962453" (UID: "b613c3f6-59f7-46b1-90ba-09793e962453"). InnerVolumeSpecName "kube-api-access-fs4bs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.771328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b613c3f6-59f7-46b1-90ba-09793e962453" (UID: "b613c3f6-59f7-46b1-90ba-09793e962453"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.772936 4764 scope.go:117] "RemoveContainer" containerID="783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.816932 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.824879 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-579hd" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.825275 4764 scope.go:117] "RemoveContainer" containerID="517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.828214 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.839647 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zl7q7" podStartSLOduration=3.83962497 podStartE2EDuration="3.83962497s" podCreationTimestamp="2025-12-03 23:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:54:03.76751151 +0000 UTC m=+779.528835921" watchObservedRunningTime="2025-12-03 23:54:03.83962497 +0000 UTC m=+779.600949391" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.847671 4764 scope.go:117] "RemoveContainer" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.848328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e437244c-a1f6-4f74-bfc6-8eb8366719d4" (UID: "e437244c-a1f6-4f74-bfc6-8eb8366719d4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: E1203 23:54:03.849310 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563\": container with ID starting with 0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563 not found: ID does not exist" containerID="0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.849364 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563"} err="failed to get container status \"0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563\": rpc error: code = NotFound desc = could not find container \"0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563\": container with ID starting with 0470d2507a6ac5f3b89a1c2d8c6c4cde26be4ba0ac5235b18e204e8382bbd563 not found: ID does not exist" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.849390 4764 scope.go:117] "RemoveContainer" containerID="783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73" Dec 03 23:54:03 crc kubenswrapper[4764]: E1203 23:54:03.849668 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73\": container with ID starting with 783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73 not found: ID does not exist" containerID="783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.849691 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73"} 
err="failed to get container status \"783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73\": rpc error: code = NotFound desc = could not find container \"783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73\": container with ID starting with 783ad9e7cf91836107649ba69135c98b5d59c26ed90183ccc01cc740f2e53a73 not found: ID does not exist" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.849706 4764 scope.go:117] "RemoveContainer" containerID="517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8" Dec 03 23:54:03 crc kubenswrapper[4764]: E1203 23:54:03.862141 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8\": container with ID starting with 517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8 not found: ID does not exist" containerID="517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.862192 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8"} err="failed to get container status \"517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8\": rpc error: code = NotFound desc = could not find container \"517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8\": container with ID starting with 517937a60ecd6b5cc75a31f47f2a64fe4f5bee2ed97ad3acfb5057c459d649b8 not found: ID does not exist" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.862224 4764 scope.go:117] "RemoveContainer" containerID="d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.864695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-utilities\") pod \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.864832 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwkrk\" (UniqueName: \"kubernetes.io/projected/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-kube-api-access-hwkrk\") pod \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.864899 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-catalog-content\") pod \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.864939 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-utilities\") pod \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.864999 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-catalog-content\") pod \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\" (UID: \"cf0a4d00-e173-46ad-9332-7b8cf8801cb3\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.865042 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v7wb\" (UniqueName: \"kubernetes.io/projected/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-kube-api-access-8v7wb\") pod \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\" (UID: \"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db\") " Dec 03 23:54:03 crc 
kubenswrapper[4764]: I1203 23:54:03.867100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-utilities" (OuterVolumeSpecName: "utilities") pod "cf0a4d00-e173-46ad-9332-7b8cf8801cb3" (UID: "cf0a4d00-e173-46ad-9332-7b8cf8801cb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.867340 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.867361 4764 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c3f6-59f7-46b1-90ba-09793e962453-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.867373 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fs4bs\" (UniqueName: \"kubernetes.io/projected/b613c3f6-59f7-46b1-90ba-09793e962453-kube-api-access-fs4bs\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.867388 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv467\" (UniqueName: \"kubernetes.io/projected/e437244c-a1f6-4f74-bfc6-8eb8366719d4-kube-api-access-tv467\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.867398 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e437244c-a1f6-4f74-bfc6-8eb8366719d4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.869435 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-utilities" (OuterVolumeSpecName: "utilities") pod "bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" (UID: "bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.871443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-kube-api-access-hwkrk" (OuterVolumeSpecName: "kube-api-access-hwkrk") pod "cf0a4d00-e173-46ad-9332-7b8cf8801cb3" (UID: "cf0a4d00-e173-46ad-9332-7b8cf8801cb3"). InnerVolumeSpecName "kube-api-access-hwkrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.883988 4764 scope.go:117] "RemoveContainer" containerID="d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077" Dec 03 23:54:03 crc kubenswrapper[4764]: E1203 23:54:03.884531 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077\": container with ID starting with d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077 not found: ID does not exist" containerID="d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.884628 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077"} err="failed to get container status \"d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077\": rpc error: code = NotFound desc = could not find container \"d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077\": container with ID starting with d39f9852e3b381218a3fa3ef81c8aeec6ea8675380e82ecdc220875d20c3c077 not found: ID does not exist" Dec 03 
23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.891704 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-kube-api-access-8v7wb" (OuterVolumeSpecName: "kube-api-access-8v7wb") pod "bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" (UID: "bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db"). InnerVolumeSpecName "kube-api-access-8v7wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.896996 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" (UID: "bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.902195 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.926579 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf0a4d00-e173-46ad-9332-7b8cf8801cb3" (UID: "cf0a4d00-e173-46ad-9332-7b8cf8801cb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.968557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-utilities\") pod \"07053195-0b45-49e2-8c63-ad9a547f0714\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.968599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9rkn\" (UniqueName: \"kubernetes.io/projected/07053195-0b45-49e2-8c63-ad9a547f0714-kube-api-access-d9rkn\") pod \"07053195-0b45-49e2-8c63-ad9a547f0714\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.968630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-utilities\") pod \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.968657 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-catalog-content\") pod \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.968731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-catalog-content\") pod \"07053195-0b45-49e2-8c63-ad9a547f0714\" (UID: \"07053195-0b45-49e2-8c63-ad9a547f0714\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.968777 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt7km\" 
(UniqueName: \"kubernetes.io/projected/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-kube-api-access-bt7km\") pod \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\" (UID: \"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce\") " Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969276 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969317 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v7wb\" (UniqueName: \"kubernetes.io/projected/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-kube-api-access-8v7wb\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969330 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969343 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969356 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwkrk\" (UniqueName: \"kubernetes.io/projected/cf0a4d00-e173-46ad-9332-7b8cf8801cb3-kube-api-access-hwkrk\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969368 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.969746 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-utilities" (OuterVolumeSpecName: "utilities") pod "58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" (UID: "58504f1f-ebbc-4c05-a4fd-f68cfa3609ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.971305 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-kube-api-access-bt7km" (OuterVolumeSpecName: "kube-api-access-bt7km") pod "58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" (UID: "58504f1f-ebbc-4c05-a4fd-f68cfa3609ce"). InnerVolumeSpecName "kube-api-access-bt7km". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.975301 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-utilities" (OuterVolumeSpecName: "utilities") pod "07053195-0b45-49e2-8c63-ad9a547f0714" (UID: "07053195-0b45-49e2-8c63-ad9a547f0714"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:03 crc kubenswrapper[4764]: I1203 23:54:03.976946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07053195-0b45-49e2-8c63-ad9a547f0714-kube-api-access-d9rkn" (OuterVolumeSpecName: "kube-api-access-d9rkn") pod "07053195-0b45-49e2-8c63-ad9a547f0714" (UID: "07053195-0b45-49e2-8c63-ad9a547f0714"). InnerVolumeSpecName "kube-api-access-d9rkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.017041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07053195-0b45-49e2-8c63-ad9a547f0714" (UID: "07053195-0b45-49e2-8c63-ad9a547f0714"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.038279 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-567bg"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.041398 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-567bg"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.053884 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mzk5"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.056637 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7mzk5"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.070083 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.070105 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt7km\" (UniqueName: \"kubernetes.io/projected/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-kube-api-access-bt7km\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.070116 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07053195-0b45-49e2-8c63-ad9a547f0714-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.070126 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9rkn\" (UniqueName: \"kubernetes.io/projected/07053195-0b45-49e2-8c63-ad9a547f0714-kube-api-access-d9rkn\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.070135 4764 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.086914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" (UID: "58504f1f-ebbc-4c05-a4fd-f68cfa3609ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.115261 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zpdm"] Dec 03 23:54:04 crc kubenswrapper[4764]: W1203 23:54:04.120108 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda682d811_4c67_413d_b035_1e06ca723876.slice/crio-012e4aef3de40d273e4b5ec6931c1f9c1bba0fdae4416cd8523dcfb3206c2aa5 WatchSource:0}: Error finding container 012e4aef3de40d273e4b5ec6931c1f9c1bba0fdae4416cd8523dcfb3206c2aa5: Status 404 returned error can't find the container with id 012e4aef3de40d273e4b5ec6931c1f9c1bba0fdae4416cd8523dcfb3206c2aa5 Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.170856 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.559262 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b613c3f6-59f7-46b1-90ba-09793e962453" path="/var/lib/kubelet/pods/b613c3f6-59f7-46b1-90ba-09793e962453/volumes" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.560094 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" path="/var/lib/kubelet/pods/e437244c-a1f6-4f74-bfc6-8eb8366719d4/volumes" Dec 03 23:54:04 crc kubenswrapper[4764]: E1203 23:54:04.677902 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbe4cf1_d03d_4ca0_b5a1_cc8384a882db.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58504f1f_ebbc_4c05_a4fd_f68cfa3609ce.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58504f1f_ebbc_4c05_a4fd_f68cfa3609ce.slice/crio-2d73b33e9e6efa6aa0b783dcda998d53922309f81d5b4515b2366782c1f6daef\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbe4cf1_d03d_4ca0_b5a1_cc8384a882db.slice/crio-f008e2a000be0f9c536930c973cb32481f2f875741ee79f8aa656940025e95cd\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9398cf_e8ac_4bb6_b3ac_670859827874.slice/crio-00227c8c6335f8ed487726a1f6a217695ae059d6c3229d4588e107183db9f924.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9398cf_e8ac_4bb6_b3ac_670859827874.slice/crio-conmon-00227c8c6335f8ed487726a1f6a217695ae059d6c3229d4588e107183db9f924.scope\": RecentStats: unable to find data in memory cache]" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.738056 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4vk9" event={"ID":"bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db","Type":"ContainerDied","Data":"f008e2a000be0f9c536930c973cb32481f2f875741ee79f8aa656940025e95cd"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.738517 4764 scope.go:117] "RemoveContainer" 
containerID="a4cf9da8c6785ca4ac0659d435c7d3297e9503835bdd3640b6cb7f6b2a485c27" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.738177 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4vk9" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.742152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xlmg" event={"ID":"07053195-0b45-49e2-8c63-ad9a547f0714","Type":"ContainerDied","Data":"8f9139c31e213a5478e1987aa3baa14a7b89c15e44300843f50cb03525f02fb3"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.742237 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xlmg" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.752277 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-579hd" event={"ID":"cf0a4d00-e173-46ad-9332-7b8cf8801cb3","Type":"ContainerDied","Data":"1e974564b179831fbf30b60f6e58a786f4bcf29d2ca639556c0f11b77cdbac1d"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.752351 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-579hd" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.754342 4764 generic.go:334] "Generic (PLEG): container finished" podID="a682d811-4c67-413d-b035-1e06ca723876" containerID="dfc707103734d4eab8ba7855effe11a1491f8cecf6bafb17192ebc654f789eed" exitCode=0 Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.754396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zpdm" event={"ID":"a682d811-4c67-413d-b035-1e06ca723876","Type":"ContainerDied","Data":"dfc707103734d4eab8ba7855effe11a1491f8cecf6bafb17192ebc654f789eed"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.754412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zpdm" event={"ID":"a682d811-4c67-413d-b035-1e06ca723876","Type":"ContainerStarted","Data":"012e4aef3de40d273e4b5ec6931c1f9c1bba0fdae4416cd8523dcfb3206c2aa5"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.761441 4764 generic.go:334] "Generic (PLEG): container finished" podID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerID="75d9fdeab297bac18b64e653c45752f503f9d4b8e6d2e1292092b68ed19609ac" exitCode=0 Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.761478 4764 generic.go:334] "Generic (PLEG): container finished" podID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerID="00227c8c6335f8ed487726a1f6a217695ae059d6c3229d4588e107183db9f924" exitCode=0 Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.761533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerDied","Data":"75d9fdeab297bac18b64e653c45752f503f9d4b8e6d2e1292092b68ed19609ac"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.761569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" 
event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerDied","Data":"00227c8c6335f8ed487726a1f6a217695ae059d6c3229d4588e107183db9f924"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.768164 4764 scope.go:117] "RemoveContainer" containerID="9d60c3cd6a766f4a972178f30a57623bfba473f991a42c23502bb614e651df32" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.769085 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4vk9"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.771483 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4q9cs" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.772411 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4q9cs" event={"ID":"58504f1f-ebbc-4c05-a4fd-f68cfa3609ce","Type":"ContainerDied","Data":"2d73b33e9e6efa6aa0b783dcda998d53922309f81d5b4515b2366782c1f6daef"} Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.778246 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4vk9"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.787861 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8xlmg"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.792708 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8xlmg"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.812248 4764 scope.go:117] "RemoveContainer" containerID="9281a039b8c25b036794888fbc3bd629b416cc75c0a587ef8549fcb5a040523a" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.839668 4764 scope.go:117] "RemoveContainer" containerID="e3f6a557dda880f683deda92b44d3cff5368fbeece001670ad8ed190463004f2" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.853353 4764 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-579hd"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.859733 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-579hd"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.866758 4764 scope.go:117] "RemoveContainer" containerID="482b5e302b8d8f82fac0b31ce41d49116e8de4f24e551b8a3a9f557444298633" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.869401 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4q9cs"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.872328 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4q9cs"] Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.882473 4764 scope.go:117] "RemoveContainer" containerID="110042ad61a3df6b77674044fd7d0bc74910a895af3ead267799ff084160d309" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.910318 4764 scope.go:117] "RemoveContainer" containerID="0f903619cebcd32d25fbd1adb22c69fe0a2d46ba6148d9b811186807c8dea4ca" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.929204 4764 scope.go:117] "RemoveContainer" containerID="6ebec57a5380ba2cec0c9f73de6734f6c26732f83ca84d87cb1f2485c061674a" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.944087 4764 scope.go:117] "RemoveContainer" containerID="2df392272ed38275455b0ec92abf69b534418a1ddc35ac7ad84f002640a8022a" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.959078 4764 scope.go:117] "RemoveContainer" containerID="676d2771f19ef22634b8c77a3e146bdd263fdcecf537e9bf91485cd8f6176d02" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.975019 4764 scope.go:117] "RemoveContainer" containerID="47fbb56ca12576a9bbf345ccee30b4e5b6fd3dc718754f8f3dcf59c1c2022da3" Dec 03 23:54:04 crc kubenswrapper[4764]: I1203 23:54:04.991355 4764 scope.go:117] "RemoveContainer" 
containerID="f8dfbab296b15bcad9ebe856603a8f2587ef299ee93f0bb29067e9f724ff654f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.377997 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-75x4f"] Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378284 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378304 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378323 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378334 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378353 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378364 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378380 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378411 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378421 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378436 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378446 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378458 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378468 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378478 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378489 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378506 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378517 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378531 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378543 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378557 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378568 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378584 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378594 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="extract-utilities" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378610 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378621 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378638 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378649 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378663 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378674 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="extract-content" Dec 03 23:54:05 crc kubenswrapper[4764]: E1203 23:54:05.378689 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b613c3f6-59f7-46b1-90ba-09793e962453" containerName="marketplace-operator" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378700 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b613c3f6-59f7-46b1-90ba-09793e962453" containerName="marketplace-operator" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378863 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378879 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e437244c-a1f6-4f74-bfc6-8eb8366719d4" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378896 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378917 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b613c3f6-59f7-46b1-90ba-09793e962453" containerName="marketplace-operator" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378931 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.378948 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" containerName="registry-server" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.380047 4764 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.382207 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.401775 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75x4f"] Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.486046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e5bec5-5dac-4d3b-999f-864ccb0a7595-utilities\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.486122 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plww6\" (UniqueName: \"kubernetes.io/projected/72e5bec5-5dac-4d3b-999f-864ccb0a7595-kube-api-access-plww6\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.486158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e5bec5-5dac-4d3b-999f-864ccb0a7595-catalog-content\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.580683 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z65ht"] Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.582527 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.585801 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.587690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72e5bec5-5dac-4d3b-999f-864ccb0a7595-utilities\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.587855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plww6\" (UniqueName: \"kubernetes.io/projected/72e5bec5-5dac-4d3b-999f-864ccb0a7595-kube-api-access-plww6\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.587960 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e5bec5-5dac-4d3b-999f-864ccb0a7595-catalog-content\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.588663 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72e5bec5-5dac-4d3b-999f-864ccb0a7595-catalog-content\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.589146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/72e5bec5-5dac-4d3b-999f-864ccb0a7595-utilities\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.595640 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z65ht"] Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.612662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plww6\" (UniqueName: \"kubernetes.io/projected/72e5bec5-5dac-4d3b-999f-864ccb0a7595-kube-api-access-plww6\") pod \"redhat-operators-75x4f\" (UID: \"72e5bec5-5dac-4d3b-999f-864ccb0a7595\") " pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.689301 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkx82\" (UniqueName: \"kubernetes.io/projected/8708be4a-2506-4174-b4a9-3e9627a6ce3c-kube-api-access-fkx82\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.689388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8708be4a-2506-4174-b4a9-3e9627a6ce3c-utilities\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.689475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8708be4a-2506-4174-b4a9-3e9627a6ce3c-catalog-content\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " 
pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.713122 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.776977 4764 generic.go:334] "Generic (PLEG): container finished" podID="a682d811-4c67-413d-b035-1e06ca723876" containerID="7a831805b4554afb7a7bf4abbe6599ab5f507b7287b90e175adb0b4467cf6822" exitCode=0 Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.777054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zpdm" event={"ID":"a682d811-4c67-413d-b035-1e06ca723876","Type":"ContainerDied","Data":"7a831805b4554afb7a7bf4abbe6599ab5f507b7287b90e175adb0b4467cf6822"} Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.780803 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerStarted","Data":"d4c845efb7a22c9cb63e3b048e669b7ea276a0fe5bb0f16b98567bdc12b767ac"} Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.791459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8708be4a-2506-4174-b4a9-3e9627a6ce3c-catalog-content\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.791778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkx82\" (UniqueName: \"kubernetes.io/projected/8708be4a-2506-4174-b4a9-3e9627a6ce3c-kube-api-access-fkx82\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 
23:54:05.791912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8708be4a-2506-4174-b4a9-3e9627a6ce3c-utilities\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.793119 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8708be4a-2506-4174-b4a9-3e9627a6ce3c-catalog-content\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.793284 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8708be4a-2506-4174-b4a9-3e9627a6ce3c-utilities\") pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.831604 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l42gq" podStartSLOduration=2.331555211 podStartE2EDuration="3.831583858s" podCreationTimestamp="2025-12-03 23:54:02 +0000 UTC" firstStartedPulling="2025-12-03 23:54:03.720295572 +0000 UTC m=+779.481619983" lastFinishedPulling="2025-12-03 23:54:05.220324199 +0000 UTC m=+780.981648630" observedRunningTime="2025-12-03 23:54:05.829255271 +0000 UTC m=+781.590579722" watchObservedRunningTime="2025-12-03 23:54:05.831583858 +0000 UTC m=+781.592908279" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.832146 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkx82\" (UniqueName: \"kubernetes.io/projected/8708be4a-2506-4174-b4a9-3e9627a6ce3c-kube-api-access-fkx82\") 
pod \"certified-operators-z65ht\" (UID: \"8708be4a-2506-4174-b4a9-3e9627a6ce3c\") " pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.931761 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75x4f"] Dec 03 23:54:05 crc kubenswrapper[4764]: W1203 23:54:05.939674 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72e5bec5_5dac_4d3b_999f_864ccb0a7595.slice/crio-63c87e9beed1f1457a64406c1e491768a5c55d40b321b72b72ba50aaffb0c252 WatchSource:0}: Error finding container 63c87e9beed1f1457a64406c1e491768a5c55d40b321b72b72ba50aaffb0c252: Status 404 returned error can't find the container with id 63c87e9beed1f1457a64406c1e491768a5c55d40b321b72b72ba50aaffb0c252 Dec 03 23:54:05 crc kubenswrapper[4764]: I1203 23:54:05.960784 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.350496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z65ht"] Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.556766 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07053195-0b45-49e2-8c63-ad9a547f0714" path="/var/lib/kubelet/pods/07053195-0b45-49e2-8c63-ad9a547f0714/volumes" Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.558034 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58504f1f-ebbc-4c05-a4fd-f68cfa3609ce" path="/var/lib/kubelet/pods/58504f1f-ebbc-4c05-a4fd-f68cfa3609ce/volumes" Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.559267 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db" path="/var/lib/kubelet/pods/bfbe4cf1-d03d-4ca0-b5a1-cc8384a882db/volumes" Dec 03 23:54:06 crc 
kubenswrapper[4764]: I1203 23:54:06.560804 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf0a4d00-e173-46ad-9332-7b8cf8801cb3" path="/var/lib/kubelet/pods/cf0a4d00-e173-46ad-9332-7b8cf8801cb3/volumes" Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.787809 4764 generic.go:334] "Generic (PLEG): container finished" podID="8708be4a-2506-4174-b4a9-3e9627a6ce3c" containerID="4383d5883fb278078df1566ee595d303678c98a9e0c469cfd681270f0f1ab221" exitCode=0 Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.787925 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z65ht" event={"ID":"8708be4a-2506-4174-b4a9-3e9627a6ce3c","Type":"ContainerDied","Data":"4383d5883fb278078df1566ee595d303678c98a9e0c469cfd681270f0f1ab221"} Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.787990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z65ht" event={"ID":"8708be4a-2506-4174-b4a9-3e9627a6ce3c","Type":"ContainerStarted","Data":"78f6a1d55a00a15bed68f6a9fa3087a5991985c82705d0449f10ff54690d1169"} Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.789992 4764 generic.go:334] "Generic (PLEG): container finished" podID="72e5bec5-5dac-4d3b-999f-864ccb0a7595" containerID="9cfea4da130d5027275e56f3cd8f2f968b4d5356a7d35856c8fe053eeaf1a4c6" exitCode=0 Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.790053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75x4f" event={"ID":"72e5bec5-5dac-4d3b-999f-864ccb0a7595","Type":"ContainerDied","Data":"9cfea4da130d5027275e56f3cd8f2f968b4d5356a7d35856c8fe053eeaf1a4c6"} Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.790071 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75x4f" 
event={"ID":"72e5bec5-5dac-4d3b-999f-864ccb0a7595","Type":"ContainerStarted","Data":"63c87e9beed1f1457a64406c1e491768a5c55d40b321b72b72ba50aaffb0c252"} Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.797357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zpdm" event={"ID":"a682d811-4c67-413d-b035-1e06ca723876","Type":"ContainerStarted","Data":"2c0aa339a45d72360947a4263f16901ba4730c3e0b3f7308b11990e4c08d4272"} Dec 03 23:54:06 crc kubenswrapper[4764]: I1203 23:54:06.847797 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8zpdm" podStartSLOduration=2.407464551 podStartE2EDuration="3.847778344s" podCreationTimestamp="2025-12-03 23:54:03 +0000 UTC" firstStartedPulling="2025-12-03 23:54:04.768627345 +0000 UTC m=+780.529951756" lastFinishedPulling="2025-12-03 23:54:06.208941138 +0000 UTC m=+781.970265549" observedRunningTime="2025-12-03 23:54:06.840846714 +0000 UTC m=+782.602171135" watchObservedRunningTime="2025-12-03 23:54:06.847778344 +0000 UTC m=+782.609102775" Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.782027 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-865zg"] Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.783812 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.795049 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-865zg"] Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.810521 4764 generic.go:334] "Generic (PLEG): container finished" podID="8708be4a-2506-4174-b4a9-3e9627a6ce3c" containerID="d25c6186fd2a32bb2bf3b0727b9babd56111d4daa50567e3f2b02094f1396352" exitCode=0 Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.810584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z65ht" event={"ID":"8708be4a-2506-4174-b4a9-3e9627a6ce3c","Type":"ContainerDied","Data":"d25c6186fd2a32bb2bf3b0727b9babd56111d4daa50567e3f2b02094f1396352"} Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.814478 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75x4f" event={"ID":"72e5bec5-5dac-4d3b-999f-864ccb0a7595","Type":"ContainerStarted","Data":"f38f0000d454393134d8e3fb3cf855762cf6b82296e2d7a44fdf188ded975fa9"} Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.935512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-catalog-content\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.935995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptfk\" (UniqueName: \"kubernetes.io/projected/947df551-a4ab-4b33-8c2f-1b535a557790-kube-api-access-nptfk\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 
03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.936060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-utilities\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.985317 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k95g9"] Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.988882 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:07 crc kubenswrapper[4764]: I1203 23:54:07.994648 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k95g9"] Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.037041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nptfk\" (UniqueName: \"kubernetes.io/projected/947df551-a4ab-4b33-8c2f-1b535a557790-kube-api-access-nptfk\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.037164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-utilities\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.037257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-catalog-content\") pod 
\"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.037691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-utilities\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.037948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-catalog-content\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.063656 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptfk\" (UniqueName: \"kubernetes.io/projected/947df551-a4ab-4b33-8c2f-1b535a557790-kube-api-access-nptfk\") pod \"community-operators-865zg\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.110969 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.138299 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh6bc\" (UniqueName: \"kubernetes.io/projected/c7065eff-afd9-444c-8830-c58d4c4702c9-kube-api-access-wh6bc\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.138378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7065eff-afd9-444c-8830-c58d4c4702c9-utilities\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.138525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7065eff-afd9-444c-8830-c58d4c4702c9-catalog-content\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.239659 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh6bc\" (UniqueName: \"kubernetes.io/projected/c7065eff-afd9-444c-8830-c58d4c4702c9-kube-api-access-wh6bc\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.239836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7065eff-afd9-444c-8830-c58d4c4702c9-utilities\") pod 
\"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.239957 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7065eff-afd9-444c-8830-c58d4c4702c9-catalog-content\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.240828 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7065eff-afd9-444c-8830-c58d4c4702c9-utilities\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.240935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7065eff-afd9-444c-8830-c58d4c4702c9-catalog-content\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.275897 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh6bc\" (UniqueName: \"kubernetes.io/projected/c7065eff-afd9-444c-8830-c58d4c4702c9-kube-api-access-wh6bc\") pod \"redhat-marketplace-k95g9\" (UID: \"c7065eff-afd9-444c-8830-c58d4c4702c9\") " pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.318538 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.410668 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-865zg"] Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.558867 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k95g9"] Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.821592 4764 generic.go:334] "Generic (PLEG): container finished" podID="947df551-a4ab-4b33-8c2f-1b535a557790" containerID="c693f4d795767afa7544be682bbb891277ec5f08419c6488ef5a6fbc57e9df9a" exitCode=0 Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.821646 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerDied","Data":"c693f4d795767afa7544be682bbb891277ec5f08419c6488ef5a6fbc57e9df9a"} Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.821795 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerStarted","Data":"2cf9cf1a1b004e158e0affae589108f3b99f208f1d8af1d20e53ae78165703c5"} Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.826151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z65ht" event={"ID":"8708be4a-2506-4174-b4a9-3e9627a6ce3c","Type":"ContainerStarted","Data":"d90792254ce905b78c81a5a4c5769f4c722b5e25dfff831b5586d532cc6f5370"} Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.829492 4764 generic.go:334] "Generic (PLEG): container finished" podID="c7065eff-afd9-444c-8830-c58d4c4702c9" containerID="23ca1f2520a5a54823ddf72119d01dcd5eed9a31ed4bdcee236304cd9ade68c2" exitCode=0 Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.829685 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k95g9" event={"ID":"c7065eff-afd9-444c-8830-c58d4c4702c9","Type":"ContainerDied","Data":"23ca1f2520a5a54823ddf72119d01dcd5eed9a31ed4bdcee236304cd9ade68c2"} Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.829756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k95g9" event={"ID":"c7065eff-afd9-444c-8830-c58d4c4702c9","Type":"ContainerStarted","Data":"9b940f4900ac5678744256397f96627169c908ae9367a1e0da97a244c1110b6c"} Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.834845 4764 generic.go:334] "Generic (PLEG): container finished" podID="72e5bec5-5dac-4d3b-999f-864ccb0a7595" containerID="f38f0000d454393134d8e3fb3cf855762cf6b82296e2d7a44fdf188ded975fa9" exitCode=0 Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.834899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75x4f" event={"ID":"72e5bec5-5dac-4d3b-999f-864ccb0a7595","Type":"ContainerDied","Data":"f38f0000d454393134d8e3fb3cf855762cf6b82296e2d7a44fdf188ded975fa9"} Dec 03 23:54:08 crc kubenswrapper[4764]: I1203 23:54:08.872365 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z65ht" podStartSLOduration=2.438801595 podStartE2EDuration="3.872343262s" podCreationTimestamp="2025-12-03 23:54:05 +0000 UTC" firstStartedPulling="2025-12-03 23:54:06.790535169 +0000 UTC m=+782.551859610" lastFinishedPulling="2025-12-03 23:54:08.224076836 +0000 UTC m=+783.985401277" observedRunningTime="2025-12-03 23:54:08.864336636 +0000 UTC m=+784.625661047" watchObservedRunningTime="2025-12-03 23:54:08.872343262 +0000 UTC m=+784.633667683" Dec 03 23:54:09 crc kubenswrapper[4764]: I1203 23:54:09.844844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" 
event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerStarted","Data":"aa2846c35039efa37416509ffb7daf24d37d2d3eb3ae7aecab345fb28f30ae63"} Dec 03 23:54:09 crc kubenswrapper[4764]: I1203 23:54:09.849075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k95g9" event={"ID":"c7065eff-afd9-444c-8830-c58d4c4702c9","Type":"ContainerStarted","Data":"1148146e67137ea8c4a3278420cb9c6170f770cb26baa8374a0bf7d601e15d83"} Dec 03 23:54:09 crc kubenswrapper[4764]: I1203 23:54:09.860674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75x4f" event={"ID":"72e5bec5-5dac-4d3b-999f-864ccb0a7595","Type":"ContainerStarted","Data":"1bd4b16b58a7bcd2a7c46f2d75e458bdb82421291ea8a427a7d9be02ea81e949"} Dec 03 23:54:09 crc kubenswrapper[4764]: I1203 23:54:09.934271 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-75x4f" podStartSLOduration=2.42711274 podStartE2EDuration="4.93424736s" podCreationTimestamp="2025-12-03 23:54:05 +0000 UTC" firstStartedPulling="2025-12-03 23:54:06.791415871 +0000 UTC m=+782.552740282" lastFinishedPulling="2025-12-03 23:54:09.298550441 +0000 UTC m=+785.059874902" observedRunningTime="2025-12-03 23:54:09.931279087 +0000 UTC m=+785.692603508" watchObservedRunningTime="2025-12-03 23:54:09.93424736 +0000 UTC m=+785.695571771" Dec 03 23:54:10 crc kubenswrapper[4764]: I1203 23:54:10.869474 4764 generic.go:334] "Generic (PLEG): container finished" podID="947df551-a4ab-4b33-8c2f-1b535a557790" containerID="aa2846c35039efa37416509ffb7daf24d37d2d3eb3ae7aecab345fb28f30ae63" exitCode=0 Dec 03 23:54:10 crc kubenswrapper[4764]: I1203 23:54:10.869797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerDied","Data":"aa2846c35039efa37416509ffb7daf24d37d2d3eb3ae7aecab345fb28f30ae63"} 
Dec 03 23:54:10 crc kubenswrapper[4764]: I1203 23:54:10.877403 4764 generic.go:334] "Generic (PLEG): container finished" podID="c7065eff-afd9-444c-8830-c58d4c4702c9" containerID="1148146e67137ea8c4a3278420cb9c6170f770cb26baa8374a0bf7d601e15d83" exitCode=0 Dec 03 23:54:10 crc kubenswrapper[4764]: I1203 23:54:10.877990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k95g9" event={"ID":"c7065eff-afd9-444c-8830-c58d4c4702c9","Type":"ContainerDied","Data":"1148146e67137ea8c4a3278420cb9c6170f770cb26baa8374a0bf7d601e15d83"} Dec 03 23:54:11 crc kubenswrapper[4764]: I1203 23:54:11.886158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerStarted","Data":"2f8088fe84d23c7a1a3fb472e6c00a4b47cebd6530ca96baad45f1e72ed3acda"} Dec 03 23:54:11 crc kubenswrapper[4764]: I1203 23:54:11.889579 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k95g9" event={"ID":"c7065eff-afd9-444c-8830-c58d4c4702c9","Type":"ContainerStarted","Data":"ba2917b09e2fccfd0e9ee133f244c9d88acfd5d6bbda049c07c833225a25e549"} Dec 03 23:54:11 crc kubenswrapper[4764]: I1203 23:54:11.931518 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k95g9" podStartSLOduration=2.169736679 podStartE2EDuration="4.931496568s" podCreationTimestamp="2025-12-03 23:54:07 +0000 UTC" firstStartedPulling="2025-12-03 23:54:08.832683959 +0000 UTC m=+784.594008420" lastFinishedPulling="2025-12-03 23:54:11.594443888 +0000 UTC m=+787.355768309" observedRunningTime="2025-12-03 23:54:11.930447602 +0000 UTC m=+787.691772053" watchObservedRunningTime="2025-12-03 23:54:11.931496568 +0000 UTC m=+787.692821019" Dec 03 23:54:11 crc kubenswrapper[4764]: I1203 23:54:11.933476 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-865zg" podStartSLOduration=2.127582764 podStartE2EDuration="4.933465816s" podCreationTimestamp="2025-12-03 23:54:07 +0000 UTC" firstStartedPulling="2025-12-03 23:54:08.823799361 +0000 UTC m=+784.585123812" lastFinishedPulling="2025-12-03 23:54:11.629682433 +0000 UTC m=+787.391006864" observedRunningTime="2025-12-03 23:54:11.913126657 +0000 UTC m=+787.674451098" watchObservedRunningTime="2025-12-03 23:54:11.933465816 +0000 UTC m=+787.694790267" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.296707 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.297755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.362265 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.610841 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.611119 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.678042 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.953413 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:13 crc kubenswrapper[4764]: I1203 23:54:13.958263 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:15 crc kubenswrapper[4764]: I1203 23:54:15.713410 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:15 crc kubenswrapper[4764]: I1203 23:54:15.714667 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:15 crc kubenswrapper[4764]: I1203 23:54:15.786126 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:15 crc kubenswrapper[4764]: I1203 23:54:15.961594 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:15 crc kubenswrapper[4764]: I1203 23:54:15.961695 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:15 crc kubenswrapper[4764]: I1203 23:54:15.978706 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-75x4f" Dec 03 23:54:16 crc kubenswrapper[4764]: I1203 23:54:16.002935 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:16 crc kubenswrapper[4764]: I1203 23:54:16.169305 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l42gq"] Dec 03 23:54:16 crc kubenswrapper[4764]: I1203 23:54:16.169953 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l42gq" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="registry-server" containerID="cri-o://d4c845efb7a22c9cb63e3b048e669b7ea276a0fe5bb0f16b98567bdc12b767ac" gracePeriod=2 Dec 03 23:54:16 crc kubenswrapper[4764]: I1203 23:54:16.371883 4764 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zpdm"] Dec 03 23:54:16 crc kubenswrapper[4764]: I1203 23:54:16.372230 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8zpdm" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="registry-server" containerID="cri-o://2c0aa339a45d72360947a4263f16901ba4730c3e0b3f7308b11990e4c08d4272" gracePeriod=2 Dec 03 23:54:16 crc kubenswrapper[4764]: I1203 23:54:16.992907 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z65ht" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.112066 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.112187 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.150102 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.319052 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.320032 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.388453 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.967614 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k95g9" Dec 03 
23:54:18 crc kubenswrapper[4764]: I1203 23:54:18.968135 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-865zg" Dec 03 23:54:19 crc kubenswrapper[4764]: I1203 23:54:19.950540 4764 generic.go:334] "Generic (PLEG): container finished" podID="a682d811-4c67-413d-b035-1e06ca723876" containerID="2c0aa339a45d72360947a4263f16901ba4730c3e0b3f7308b11990e4c08d4272" exitCode=0 Dec 03 23:54:19 crc kubenswrapper[4764]: I1203 23:54:19.950619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zpdm" event={"ID":"a682d811-4c67-413d-b035-1e06ca723876","Type":"ContainerDied","Data":"2c0aa339a45d72360947a4263f16901ba4730c3e0b3f7308b11990e4c08d4272"} Dec 03 23:54:19 crc kubenswrapper[4764]: I1203 23:54:19.954974 4764 generic.go:334] "Generic (PLEG): container finished" podID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerID="d4c845efb7a22c9cb63e3b048e669b7ea276a0fe5bb0f16b98567bdc12b767ac" exitCode=0 Dec 03 23:54:19 crc kubenswrapper[4764]: I1203 23:54:19.955049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerDied","Data":"d4c845efb7a22c9cb63e3b048e669b7ea276a0fe5bb0f16b98567bdc12b767ac"} Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.417743 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.542985 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.569638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-catalog-content\") pod \"6f9398cf-e8ac-4bb6-b3ac-670859827874\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.569850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhqh\" (UniqueName: \"kubernetes.io/projected/6f9398cf-e8ac-4bb6-b3ac-670859827874-kube-api-access-qjhqh\") pod \"6f9398cf-e8ac-4bb6-b3ac-670859827874\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.570175 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-utilities\") pod \"a682d811-4c67-413d-b035-1e06ca723876\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.570247 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-utilities\") pod \"6f9398cf-e8ac-4bb6-b3ac-670859827874\" (UID: \"6f9398cf-e8ac-4bb6-b3ac-670859827874\") " Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.572088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-utilities" (OuterVolumeSpecName: "utilities") pod "6f9398cf-e8ac-4bb6-b3ac-670859827874" (UID: "6f9398cf-e8ac-4bb6-b3ac-670859827874"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.572978 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-utilities" (OuterVolumeSpecName: "utilities") pod "a682d811-4c67-413d-b035-1e06ca723876" (UID: "a682d811-4c67-413d-b035-1e06ca723876"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.585916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9398cf-e8ac-4bb6-b3ac-670859827874-kube-api-access-qjhqh" (OuterVolumeSpecName: "kube-api-access-qjhqh") pod "6f9398cf-e8ac-4bb6-b3ac-670859827874" (UID: "6f9398cf-e8ac-4bb6-b3ac-670859827874"). InnerVolumeSpecName "kube-api-access-qjhqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.635929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f9398cf-e8ac-4bb6-b3ac-670859827874" (UID: "6f9398cf-e8ac-4bb6-b3ac-670859827874"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.671494 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vzph\" (UniqueName: \"kubernetes.io/projected/a682d811-4c67-413d-b035-1e06ca723876-kube-api-access-5vzph\") pod \"a682d811-4c67-413d-b035-1e06ca723876\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.671633 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-catalog-content\") pod \"a682d811-4c67-413d-b035-1e06ca723876\" (UID: \"a682d811-4c67-413d-b035-1e06ca723876\") " Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.672028 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.672070 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.672097 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9398cf-e8ac-4bb6-b3ac-670859827874-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.672128 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjhqh\" (UniqueName: \"kubernetes.io/projected/6f9398cf-e8ac-4bb6-b3ac-670859827874-kube-api-access-qjhqh\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.675972 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a682d811-4c67-413d-b035-1e06ca723876-kube-api-access-5vzph" (OuterVolumeSpecName: "kube-api-access-5vzph") pod "a682d811-4c67-413d-b035-1e06ca723876" (UID: "a682d811-4c67-413d-b035-1e06ca723876"). InnerVolumeSpecName "kube-api-access-5vzph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.687707 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a682d811-4c67-413d-b035-1e06ca723876" (UID: "a682d811-4c67-413d-b035-1e06ca723876"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.772512 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a682d811-4c67-413d-b035-1e06ca723876-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.772554 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vzph\" (UniqueName: \"kubernetes.io/projected/a682d811-4c67-413d-b035-1e06ca723876-kube-api-access-5vzph\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.868903 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.868997 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.967211 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8zpdm" event={"ID":"a682d811-4c67-413d-b035-1e06ca723876","Type":"ContainerDied","Data":"012e4aef3de40d273e4b5ec6931c1f9c1bba0fdae4416cd8523dcfb3206c2aa5"} Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.967667 4764 scope.go:117] "RemoveContainer" containerID="2c0aa339a45d72360947a4263f16901ba4730c3e0b3f7308b11990e4c08d4272" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.967329 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8zpdm" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.970898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l42gq" event={"ID":"6f9398cf-e8ac-4bb6-b3ac-670859827874","Type":"ContainerDied","Data":"ca45b7343f6975576200535e2a22609611b58e0a6a9e31995d4d6883b5794aa7"} Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.971039 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l42gq" Dec 03 23:54:20 crc kubenswrapper[4764]: I1203 23:54:20.992394 4764 scope.go:117] "RemoveContainer" containerID="7a831805b4554afb7a7bf4abbe6599ab5f507b7287b90e175adb0b4467cf6822" Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.028541 4764 scope.go:117] "RemoveContainer" containerID="dfc707103734d4eab8ba7855effe11a1491f8cecf6bafb17192ebc654f789eed" Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.037363 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zpdm"] Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.041367 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8zpdm"] Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.044393 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l42gq"] Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.047260 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l42gq"] Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.055020 4764 scope.go:117] "RemoveContainer" containerID="d4c845efb7a22c9cb63e3b048e669b7ea276a0fe5bb0f16b98567bdc12b767ac" Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.072778 4764 scope.go:117] "RemoveContainer" containerID="00227c8c6335f8ed487726a1f6a217695ae059d6c3229d4588e107183db9f924" Dec 03 23:54:21 crc kubenswrapper[4764]: I1203 23:54:21.096483 4764 scope.go:117] "RemoveContainer" containerID="75d9fdeab297bac18b64e653c45752f503f9d4b8e6d2e1292092b68ed19609ac" Dec 03 23:54:22 crc kubenswrapper[4764]: I1203 23:54:22.552465 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" path="/var/lib/kubelet/pods/6f9398cf-e8ac-4bb6-b3ac-670859827874/volumes" Dec 03 23:54:22 crc kubenswrapper[4764]: I1203 23:54:22.553656 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a682d811-4c67-413d-b035-1e06ca723876" path="/var/lib/kubelet/pods/a682d811-4c67-413d-b035-1e06ca723876/volumes" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.446537 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx"] Dec 03 23:54:30 crc kubenswrapper[4764]: E1203 23:54:30.447328 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="registry-server" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447344 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="registry-server" Dec 03 23:54:30 crc kubenswrapper[4764]: E1203 23:54:30.447357 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="extract-content" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447365 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="extract-content" Dec 03 23:54:30 crc kubenswrapper[4764]: E1203 23:54:30.447377 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="extract-content" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447385 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="extract-content" Dec 03 23:54:30 crc kubenswrapper[4764]: E1203 23:54:30.447401 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="extract-utilities" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447410 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="extract-utilities" Dec 03 23:54:30 
crc kubenswrapper[4764]: E1203 23:54:30.447425 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="extract-utilities" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447433 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="extract-utilities" Dec 03 23:54:30 crc kubenswrapper[4764]: E1203 23:54:30.447447 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="registry-server" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447456 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="registry-server" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447574 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9398cf-e8ac-4bb6-b3ac-670859827874" containerName="registry-server" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.447594 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a682d811-4c67-413d-b035-1e06ca723876" containerName="registry-server" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.448448 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.451657 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.464362 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx"] Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.609260 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62scq\" (UniqueName: \"kubernetes.io/projected/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-kube-api-access-62scq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.609348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.609394 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: 
I1203 23:54:30.710889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62scq\" (UniqueName: \"kubernetes.io/projected/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-kube-api-access-62scq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.710989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.711044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.711519 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.711747 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.742789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62scq\" (UniqueName: \"kubernetes.io/projected/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-kube-api-access-62scq\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:30 crc kubenswrapper[4764]: I1203 23:54:30.765434 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:31 crc kubenswrapper[4764]: I1203 23:54:31.181339 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx"] Dec 03 23:54:32 crc kubenswrapper[4764]: I1203 23:54:32.051105 4764 generic.go:334] "Generic (PLEG): container finished" podID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerID="cbf8a05341f80ce969d88961327cd73e5bc95c8bdd8703328a20ff56b3b2fab6" exitCode=0 Dec 03 23:54:32 crc kubenswrapper[4764]: I1203 23:54:32.051171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" event={"ID":"d789b8c1-b96b-4809-8e37-1ca6b9b39adc","Type":"ContainerDied","Data":"cbf8a05341f80ce969d88961327cd73e5bc95c8bdd8703328a20ff56b3b2fab6"} Dec 03 23:54:32 crc kubenswrapper[4764]: I1203 23:54:32.051212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" event={"ID":"d789b8c1-b96b-4809-8e37-1ca6b9b39adc","Type":"ContainerStarted","Data":"cdfb6f8355ebe197c5d32637c3ba19b7fef56cc7f87a527b4e6d44653431f888"} Dec 03 23:54:34 crc kubenswrapper[4764]: I1203 23:54:34.067505 4764 generic.go:334] "Generic (PLEG): container finished" podID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerID="21aa5f69b7a4344388c56fbf850b9b1676d12b976733da92726072c4552639e0" exitCode=0 Dec 03 23:54:34 crc kubenswrapper[4764]: I1203 23:54:34.067563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" event={"ID":"d789b8c1-b96b-4809-8e37-1ca6b9b39adc","Type":"ContainerDied","Data":"21aa5f69b7a4344388c56fbf850b9b1676d12b976733da92726072c4552639e0"} Dec 03 23:54:35 crc kubenswrapper[4764]: I1203 23:54:35.080157 4764 generic.go:334] "Generic (PLEG): container finished" podID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerID="cc93182eef3d0dc175f00f2971edda292ce0b7649c61af73fcbcd576115f2942" exitCode=0 Dec 03 23:54:35 crc kubenswrapper[4764]: I1203 23:54:35.080352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" event={"ID":"d789b8c1-b96b-4809-8e37-1ca6b9b39adc","Type":"ContainerDied","Data":"cc93182eef3d0dc175f00f2971edda292ce0b7649c61af73fcbcd576115f2942"} Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.433150 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.591169 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62scq\" (UniqueName: \"kubernetes.io/projected/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-kube-api-access-62scq\") pod \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.591245 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-util\") pod \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.591297 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-bundle\") pod \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\" (UID: \"d789b8c1-b96b-4809-8e37-1ca6b9b39adc\") " Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.591790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-bundle" (OuterVolumeSpecName: "bundle") pod "d789b8c1-b96b-4809-8e37-1ca6b9b39adc" (UID: "d789b8c1-b96b-4809-8e37-1ca6b9b39adc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.592485 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.598088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-kube-api-access-62scq" (OuterVolumeSpecName: "kube-api-access-62scq") pod "d789b8c1-b96b-4809-8e37-1ca6b9b39adc" (UID: "d789b8c1-b96b-4809-8e37-1ca6b9b39adc"). InnerVolumeSpecName "kube-api-access-62scq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.688612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-util" (OuterVolumeSpecName: "util") pod "d789b8c1-b96b-4809-8e37-1ca6b9b39adc" (UID: "d789b8c1-b96b-4809-8e37-1ca6b9b39adc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.693514 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62scq\" (UniqueName: \"kubernetes.io/projected/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-kube-api-access-62scq\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:36 crc kubenswrapper[4764]: I1203 23:54:36.693548 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d789b8c1-b96b-4809-8e37-1ca6b9b39adc-util\") on node \"crc\" DevicePath \"\"" Dec 03 23:54:37 crc kubenswrapper[4764]: I1203 23:54:37.094959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" event={"ID":"d789b8c1-b96b-4809-8e37-1ca6b9b39adc","Type":"ContainerDied","Data":"cdfb6f8355ebe197c5d32637c3ba19b7fef56cc7f87a527b4e6d44653431f888"} Dec 03 23:54:37 crc kubenswrapper[4764]: I1203 23:54:37.095012 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdfb6f8355ebe197c5d32637c3ba19b7fef56cc7f87a527b4e6d44653431f888" Dec 03 23:54:37 crc kubenswrapper[4764]: I1203 23:54:37.095042 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.992298 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg"] Dec 03 23:54:39 crc kubenswrapper[4764]: E1203 23:54:39.992864 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="extract" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.992881 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="extract" Dec 03 23:54:39 crc kubenswrapper[4764]: E1203 23:54:39.992895 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="pull" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.992902 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="pull" Dec 03 23:54:39 crc kubenswrapper[4764]: E1203 23:54:39.992918 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="util" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.992925 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="util" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.993029 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d789b8c1-b96b-4809-8e37-1ca6b9b39adc" containerName="extract" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.993452 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.996403 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 23:54:39 crc kubenswrapper[4764]: I1203 23:54:39.997491 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-ndbx8" Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.006817 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.015317 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg"] Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.136409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvt5\" (UniqueName: \"kubernetes.io/projected/a3ba2f28-5fef-4cce-bd05-34da304861e9-kube-api-access-xsvt5\") pod \"nmstate-operator-5b5b58f5c8-wxdrg\" (UID: \"a3ba2f28-5fef-4cce-bd05-34da304861e9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.237906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvt5\" (UniqueName: \"kubernetes.io/projected/a3ba2f28-5fef-4cce-bd05-34da304861e9-kube-api-access-xsvt5\") pod \"nmstate-operator-5b5b58f5c8-wxdrg\" (UID: \"a3ba2f28-5fef-4cce-bd05-34da304861e9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.259989 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvt5\" (UniqueName: \"kubernetes.io/projected/a3ba2f28-5fef-4cce-bd05-34da304861e9-kube-api-access-xsvt5\") pod \"nmstate-operator-5b5b58f5c8-wxdrg\" (UID: 
\"a3ba2f28-5fef-4cce-bd05-34da304861e9\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.310330 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" Dec 03 23:54:40 crc kubenswrapper[4764]: I1203 23:54:40.584794 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg"] Dec 03 23:54:41 crc kubenswrapper[4764]: I1203 23:54:41.121137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" event={"ID":"a3ba2f28-5fef-4cce-bd05-34da304861e9","Type":"ContainerStarted","Data":"1c6c8300830bc839a717dadf82dbe0ec7a54311fea16c17f4f475eae15eea70c"} Dec 03 23:54:43 crc kubenswrapper[4764]: I1203 23:54:43.134136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" event={"ID":"a3ba2f28-5fef-4cce-bd05-34da304861e9","Type":"ContainerStarted","Data":"d08e46681c5ebc404b6a66f707d7fcc22dd2065c89dccad6dfe42140e6ccf00d"} Dec 03 23:54:43 crc kubenswrapper[4764]: I1203 23:54:43.152586 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-wxdrg" podStartSLOduration=1.817779821 podStartE2EDuration="4.152564532s" podCreationTimestamp="2025-12-03 23:54:39 +0000 UTC" firstStartedPulling="2025-12-03 23:54:40.607971832 +0000 UTC m=+816.369296243" lastFinishedPulling="2025-12-03 23:54:42.942756543 +0000 UTC m=+818.704080954" observedRunningTime="2025-12-03 23:54:43.150028089 +0000 UTC m=+818.911352490" watchObservedRunningTime="2025-12-03 23:54:43.152564532 +0000 UTC m=+818.913888943" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.638667 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 
23:54:44.639733 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.641453 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-76zcw" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.656608 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.657700 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.660046 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.672076 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.677018 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.682744 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-rdwmt"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.683742 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.703622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-dbus-socket\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.703679 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7fb73b2d-5404-4abe-8b57-92a14a48b7ec-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4nm7p\" (UID: \"7fb73b2d-5404-4abe-8b57-92a14a48b7ec\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.703726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfjwj\" (UniqueName: \"kubernetes.io/projected/7fb73b2d-5404-4abe-8b57-92a14a48b7ec-kube-api-access-lfjwj\") pod \"nmstate-webhook-5f6d4c5ccb-4nm7p\" (UID: \"7fb73b2d-5404-4abe-8b57-92a14a48b7ec\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.703890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-nmstate-lock\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.703928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblsc\" (UniqueName: 
\"kubernetes.io/projected/ef6a3342-1957-4648-9026-0c14fd0589d4-kube-api-access-nblsc\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.703963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-ovs-socket\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.704167 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l48st\" (UniqueName: \"kubernetes.io/projected/e5bd5023-35b9-4646-b974-063e5f048d1b-kube-api-access-l48st\") pod \"nmstate-metrics-7f946cbc9-z7lsn\" (UID: \"e5bd5023-35b9-4646-b974-063e5f048d1b\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.777567 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.778259 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.783503 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.784884 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.788886 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-2nm9l" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.793864 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l48st\" (UniqueName: \"kubernetes.io/projected/e5bd5023-35b9-4646-b974-063e5f048d1b-kube-api-access-l48st\") pod \"nmstate-metrics-7f946cbc9-z7lsn\" (UID: \"e5bd5023-35b9-4646-b974-063e5f048d1b\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr5gq\" (UniqueName: \"kubernetes.io/projected/760538af-8a9b-4c70-9743-0ebd1799e6f4-kube-api-access-jr5gq\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-dbus-socket\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " 
pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7fb73b2d-5404-4abe-8b57-92a14a48b7ec-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4nm7p\" (UID: \"7fb73b2d-5404-4abe-8b57-92a14a48b7ec\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/760538af-8a9b-4c70-9743-0ebd1799e6f4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfjwj\" (UniqueName: \"kubernetes.io/projected/7fb73b2d-5404-4abe-8b57-92a14a48b7ec-kube-api-access-lfjwj\") pod \"nmstate-webhook-5f6d4c5ccb-4nm7p\" (UID: \"7fb73b2d-5404-4abe-8b57-92a14a48b7ec\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-nmstate-lock\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806274 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblsc\" (UniqueName: \"kubernetes.io/projected/ef6a3342-1957-4648-9026-0c14fd0589d4-kube-api-access-nblsc\") pod \"nmstate-handler-rdwmt\" (UID: 
\"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806293 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-ovs-socket\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806318 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/760538af-8a9b-4c70-9743-0ebd1799e6f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.806948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-dbus-socket\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.807412 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-ovs-socket\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.807436 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef6a3342-1957-4648-9026-0c14fd0589d4-nmstate-lock\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " 
pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.829066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/7fb73b2d-5404-4abe-8b57-92a14a48b7ec-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-4nm7p\" (UID: \"7fb73b2d-5404-4abe-8b57-92a14a48b7ec\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.834960 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblsc\" (UniqueName: \"kubernetes.io/projected/ef6a3342-1957-4648-9026-0c14fd0589d4-kube-api-access-nblsc\") pod \"nmstate-handler-rdwmt\" (UID: \"ef6a3342-1957-4648-9026-0c14fd0589d4\") " pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.841567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l48st\" (UniqueName: \"kubernetes.io/projected/e5bd5023-35b9-4646-b974-063e5f048d1b-kube-api-access-l48st\") pod \"nmstate-metrics-7f946cbc9-z7lsn\" (UID: \"e5bd5023-35b9-4646-b974-063e5f048d1b\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.858285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfjwj\" (UniqueName: \"kubernetes.io/projected/7fb73b2d-5404-4abe-8b57-92a14a48b7ec-kube-api-access-lfjwj\") pod \"nmstate-webhook-5f6d4c5ccb-4nm7p\" (UID: \"7fb73b2d-5404-4abe-8b57-92a14a48b7ec\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.908155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr5gq\" (UniqueName: \"kubernetes.io/projected/760538af-8a9b-4c70-9743-0ebd1799e6f4-kube-api-access-jr5gq\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: 
\"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.908231 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/760538af-8a9b-4c70-9743-0ebd1799e6f4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.908305 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/760538af-8a9b-4c70-9743-0ebd1799e6f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: E1203 23:54:44.908436 4764 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 03 23:54:44 crc kubenswrapper[4764]: E1203 23:54:44.908500 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/760538af-8a9b-4c70-9743-0ebd1799e6f4-plugin-serving-cert podName:760538af-8a9b-4c70-9743-0ebd1799e6f4 nodeName:}" failed. No retries permitted until 2025-12-03 23:54:45.408477028 +0000 UTC m=+821.169801439 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/760538af-8a9b-4c70-9743-0ebd1799e6f4-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-phnlh" (UID: "760538af-8a9b-4c70-9743-0ebd1799e6f4") : secret "plugin-serving-cert" not found Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.909786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/760538af-8a9b-4c70-9743-0ebd1799e6f4-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.947259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr5gq\" (UniqueName: \"kubernetes.io/projected/760538af-8a9b-4c70-9743-0ebd1799e6f4-kube-api-access-jr5gq\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.977556 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.993811 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d89f58785-l8x7r"] Dec 03 23:54:44 crc kubenswrapper[4764]: I1203 23:54:44.994443 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:44.996842 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.009523 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.009906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-oauth-config\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.009953 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-trusted-ca-bundle\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.010039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-config\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.010069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-service-ca\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.010093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-serving-cert\") pod 
\"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.010115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqdkp\" (UniqueName: \"kubernetes.io/projected/5256b0f1-cc7f-4f25-a49c-90fe434d3541-kube-api-access-dqdkp\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.010168 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-oauth-serving-cert\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.015987 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d89f58785-l8x7r"] Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-trusted-ca-bundle\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-oauth-config\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111421 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-config\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-service-ca\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-serving-cert\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqdkp\" (UniqueName: \"kubernetes.io/projected/5256b0f1-cc7f-4f25-a49c-90fe434d3541-kube-api-access-dqdkp\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.111509 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-oauth-serving-cert\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.112489 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-config\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.112578 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-oauth-serving-cert\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.112820 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-trusted-ca-bundle\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.113148 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5256b0f1-cc7f-4f25-a49c-90fe434d3541-service-ca\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.116617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-oauth-config\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.116683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5256b0f1-cc7f-4f25-a49c-90fe434d3541-console-serving-cert\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.136329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqdkp\" (UniqueName: \"kubernetes.io/projected/5256b0f1-cc7f-4f25-a49c-90fe434d3541-kube-api-access-dqdkp\") pod \"console-7d89f58785-l8x7r\" (UID: \"5256b0f1-cc7f-4f25-a49c-90fe434d3541\") " pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.145916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rdwmt" event={"ID":"ef6a3342-1957-4648-9026-0c14fd0589d4","Type":"ContainerStarted","Data":"a3f73ea6614c37269329296de1680215b117ce966de8bce3ec48a9916589350a"} Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.263456 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p"] Dec 03 23:54:45 crc kubenswrapper[4764]: W1203 23:54:45.271921 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fb73b2d_5404_4abe_8b57_92a14a48b7ec.slice/crio-d42fa319410296b86d23bd9857efd6b0f8ad9d4d7a8d3c620c30880dcfc054c3 WatchSource:0}: Error finding container d42fa319410296b86d23bd9857efd6b0f8ad9d4d7a8d3c620c30880dcfc054c3: Status 404 returned error can't find the container with id d42fa319410296b86d23bd9857efd6b0f8ad9d4d7a8d3c620c30880dcfc054c3 Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.332925 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.414497 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn"] Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.414558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/760538af-8a9b-4c70-9743-0ebd1799e6f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.419900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/760538af-8a9b-4c70-9743-0ebd1799e6f4-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-phnlh\" (UID: \"760538af-8a9b-4c70-9743-0ebd1799e6f4\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.530404 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d89f58785-l8x7r"] Dec 03 23:54:45 crc kubenswrapper[4764]: W1203 23:54:45.536678 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5256b0f1_cc7f_4f25_a49c_90fe434d3541.slice/crio-e7e7c61e6b1d4b0b25c359d815bcd737273e47ab531c7f27f75acfd9bbe4e458 WatchSource:0}: Error finding container e7e7c61e6b1d4b0b25c359d815bcd737273e47ab531c7f27f75acfd9bbe4e458: Status 404 returned error can't find the container with id e7e7c61e6b1d4b0b25c359d815bcd737273e47ab531c7f27f75acfd9bbe4e458 Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.696781 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" Dec 03 23:54:45 crc kubenswrapper[4764]: I1203 23:54:45.935531 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh"] Dec 03 23:54:46 crc kubenswrapper[4764]: I1203 23:54:46.155936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" event={"ID":"e5bd5023-35b9-4646-b974-063e5f048d1b","Type":"ContainerStarted","Data":"21bcb7f8f000479fb4fd12938aab9a9de31552c14e5fe0e24ac531de9b5dd2ce"} Dec 03 23:54:46 crc kubenswrapper[4764]: I1203 23:54:46.157977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" event={"ID":"760538af-8a9b-4c70-9743-0ebd1799e6f4","Type":"ContainerStarted","Data":"35a61c6113728e9f9a27a7c330f36aa643dc3f061e67c0fdb3e8597dc7b02b27"} Dec 03 23:54:46 crc kubenswrapper[4764]: I1203 23:54:46.160850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d89f58785-l8x7r" event={"ID":"5256b0f1-cc7f-4f25-a49c-90fe434d3541","Type":"ContainerStarted","Data":"90da0a5d40c4eb8106d620141b0528fff5d8721d24cc7ae86d4c0f3e63f9ca53"} Dec 03 23:54:46 crc kubenswrapper[4764]: I1203 23:54:46.160900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d89f58785-l8x7r" event={"ID":"5256b0f1-cc7f-4f25-a49c-90fe434d3541","Type":"ContainerStarted","Data":"e7e7c61e6b1d4b0b25c359d815bcd737273e47ab531c7f27f75acfd9bbe4e458"} Dec 03 23:54:46 crc kubenswrapper[4764]: I1203 23:54:46.164397 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" event={"ID":"7fb73b2d-5404-4abe-8b57-92a14a48b7ec","Type":"ContainerStarted","Data":"d42fa319410296b86d23bd9857efd6b0f8ad9d4d7a8d3c620c30880dcfc054c3"} Dec 03 23:54:46 crc kubenswrapper[4764]: I1203 23:54:46.190497 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d89f58785-l8x7r" podStartSLOduration=2.190474946 podStartE2EDuration="2.190474946s" podCreationTimestamp="2025-12-03 23:54:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:54:46.186495638 +0000 UTC m=+821.947820059" watchObservedRunningTime="2025-12-03 23:54:46.190474946 +0000 UTC m=+821.951799367" Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.178067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" event={"ID":"7fb73b2d-5404-4abe-8b57-92a14a48b7ec","Type":"ContainerStarted","Data":"08c023bad9aa0975fdb0a0a57584cc0ba2721771ae23a1b208e9dbc87aad6b08"} Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.179232 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.180264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-rdwmt" event={"ID":"ef6a3342-1957-4648-9026-0c14fd0589d4","Type":"ContainerStarted","Data":"3be5affab9054652b62100a73e489e1d3fc3d8fc4ae6d87a6b900c00db66d9d0"} Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.180830 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.183129 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" event={"ID":"e5bd5023-35b9-4646-b974-063e5f048d1b","Type":"ContainerStarted","Data":"217cb6cb51b9535a35cff31d5627596bef0d8c13180bfa369af65c0a4fb5419a"} Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.229269 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" podStartSLOduration=1.8836947469999998 podStartE2EDuration="4.229247493s" podCreationTimestamp="2025-12-03 23:54:44 +0000 UTC" firstStartedPulling="2025-12-03 23:54:45.274811997 +0000 UTC m=+821.036136408" lastFinishedPulling="2025-12-03 23:54:47.620364723 +0000 UTC m=+823.381689154" observedRunningTime="2025-12-03 23:54:48.224835534 +0000 UTC m=+823.986159955" watchObservedRunningTime="2025-12-03 23:54:48.229247493 +0000 UTC m=+823.990571904" Dec 03 23:54:48 crc kubenswrapper[4764]: I1203 23:54:48.240432 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-rdwmt" podStartSLOduration=1.706309714 podStartE2EDuration="4.240409667s" podCreationTimestamp="2025-12-03 23:54:44 +0000 UTC" firstStartedPulling="2025-12-03 23:54:45.063189354 +0000 UTC m=+820.824513765" lastFinishedPulling="2025-12-03 23:54:47.597289287 +0000 UTC m=+823.358613718" observedRunningTime="2025-12-03 23:54:48.235171378 +0000 UTC m=+823.996495789" watchObservedRunningTime="2025-12-03 23:54:48.240409667 +0000 UTC m=+824.001734088" Dec 03 23:54:49 crc kubenswrapper[4764]: I1203 23:54:49.192613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" event={"ID":"760538af-8a9b-4c70-9743-0ebd1799e6f4","Type":"ContainerStarted","Data":"e4778d808b63317630c2cc595dfdfa7556d87c74f42b5bd01e80084d21b9de87"} Dec 03 23:54:49 crc kubenswrapper[4764]: I1203 23:54:49.212252 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-phnlh" podStartSLOduration=2.53904739 podStartE2EDuration="5.212235494s" podCreationTimestamp="2025-12-03 23:54:44 +0000 UTC" firstStartedPulling="2025-12-03 23:54:45.946335955 +0000 UTC m=+821.707660376" lastFinishedPulling="2025-12-03 23:54:48.619524069 +0000 UTC m=+824.380848480" observedRunningTime="2025-12-03 23:54:49.208747468 +0000 UTC 
m=+824.970071929" watchObservedRunningTime="2025-12-03 23:54:49.212235494 +0000 UTC m=+824.973559915" Dec 03 23:54:50 crc kubenswrapper[4764]: I1203 23:54:50.203711 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" event={"ID":"e5bd5023-35b9-4646-b974-063e5f048d1b","Type":"ContainerStarted","Data":"7389632c9ea97dd7c3b95398fd564c1ec64f55e145275be4d75cb4ebf5edf3e4"} Dec 03 23:54:50 crc kubenswrapper[4764]: I1203 23:54:50.231755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-z7lsn" podStartSLOduration=2.115360493 podStartE2EDuration="6.231673379s" podCreationTimestamp="2025-12-03 23:54:44 +0000 UTC" firstStartedPulling="2025-12-03 23:54:45.450331014 +0000 UTC m=+821.211655425" lastFinishedPulling="2025-12-03 23:54:49.5666439 +0000 UTC m=+825.327968311" observedRunningTime="2025-12-03 23:54:50.229600568 +0000 UTC m=+825.990925019" watchObservedRunningTime="2025-12-03 23:54:50.231673379 +0000 UTC m=+825.992997830" Dec 03 23:54:50 crc kubenswrapper[4764]: I1203 23:54:50.869048 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:54:50 crc kubenswrapper[4764]: I1203 23:54:50.869140 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:54:55 crc kubenswrapper[4764]: I1203 23:54:55.052318 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-rdwmt" Dec 03 
23:54:55 crc kubenswrapper[4764]: I1203 23:54:55.333466 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:55 crc kubenswrapper[4764]: I1203 23:54:55.333525 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:55 crc kubenswrapper[4764]: I1203 23:54:55.341306 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:56 crc kubenswrapper[4764]: I1203 23:54:56.263004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d89f58785-l8x7r" Dec 03 23:54:56 crc kubenswrapper[4764]: I1203 23:54:56.328807 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-l7xdw"] Dec 03 23:55:05 crc kubenswrapper[4764]: I1203 23:55:05.005953 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-4nm7p" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.358656 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp"] Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.360212 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.362725 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.378182 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp"] Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.523425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.523504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.523537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xszp\" (UniqueName: \"kubernetes.io/projected/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-kube-api-access-2xszp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: 
I1203 23:55:18.625765 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.625846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xszp\" (UniqueName: \"kubernetes.io/projected/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-kube-api-access-2xszp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.625885 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.627027 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.627022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.660851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xszp\" (UniqueName: \"kubernetes.io/projected/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-kube-api-access-2xszp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.728588 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:18 crc kubenswrapper[4764]: I1203 23:55:18.962383 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp"] Dec 03 23:55:18 crc kubenswrapper[4764]: W1203 23:55:18.971932 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf82a6a_996e_42ed_b8e3_3ec8f6380323.slice/crio-3e14bbcbd454dbb0d18de0e91cd62420062516191447d72c8bdf886a82cfc862 WatchSource:0}: Error finding container 3e14bbcbd454dbb0d18de0e91cd62420062516191447d72c8bdf886a82cfc862: Status 404 returned error can't find the container with id 3e14bbcbd454dbb0d18de0e91cd62420062516191447d72c8bdf886a82cfc862 Dec 03 23:55:19 crc kubenswrapper[4764]: I1203 23:55:19.416493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" 
event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerStarted","Data":"74c4b6458a17c98d754d8f3a87903291779e11052062edfa50950dc4591ef5ce"} Dec 03 23:55:19 crc kubenswrapper[4764]: I1203 23:55:19.416559 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerStarted","Data":"3e14bbcbd454dbb0d18de0e91cd62420062516191447d72c8bdf886a82cfc862"} Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.427213 4764 generic.go:334] "Generic (PLEG): container finished" podID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerID="74c4b6458a17c98d754d8f3a87903291779e11052062edfa50950dc4591ef5ce" exitCode=0 Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.427313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerDied","Data":"74c4b6458a17c98d754d8f3a87903291779e11052062edfa50950dc4591ef5ce"} Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.869880 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.869979 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.870046 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.870790 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1120b0acc6513dd274be5731fb29ccd1424c55fbae3e411c28a5dc8b386fb90b"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:55:20 crc kubenswrapper[4764]: I1203 23:55:20.870897 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://1120b0acc6513dd274be5731fb29ccd1424c55fbae3e411c28a5dc8b386fb90b" gracePeriod=600 Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.373826 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-l7xdw" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerName="console" containerID="cri-o://b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646" gracePeriod=15 Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.436559 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="1120b0acc6513dd274be5731fb29ccd1424c55fbae3e411c28a5dc8b386fb90b" exitCode=0 Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.436615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"1120b0acc6513dd274be5731fb29ccd1424c55fbae3e411c28a5dc8b386fb90b"} Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.436658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"6a4a22d80a831b04f5a3234f6450f79d1fb6db8ec2fe0aa77fcaeb8ebd9ef8e9"} Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.436698 4764 scope.go:117] "RemoveContainer" containerID="7ab2f4a31bccf115974b6283ee0f0675d1b86be8563605d25ffd2a3fbbe3cbda" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.765150 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-l7xdw_d26ed3c8-0bba-40a7-a18a-e8718b336dcc/console/0.log" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.765535 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.874678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-config\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.874832 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-oauth-serving-cert\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.874895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll5hm\" (UniqueName: \"kubernetes.io/projected/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-kube-api-access-ll5hm\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.874932 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-serving-cert\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.875002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-service-ca\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.875038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-oauth-config\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.875074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-trusted-ca-bundle\") pod \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\" (UID: \"d26ed3c8-0bba-40a7-a18a-e8718b336dcc\") " Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.875965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.876011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-service-ca" (OuterVolumeSpecName: "service-ca") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.876033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-config" (OuterVolumeSpecName: "console-config") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.876187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.881162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.881747 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-kube-api-access-ll5hm" (OuterVolumeSpecName: "kube-api-access-ll5hm") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "kube-api-access-ll5hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.881917 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d26ed3c8-0bba-40a7-a18a-e8718b336dcc" (UID: "d26ed3c8-0bba-40a7-a18a-e8718b336dcc"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977408 4764 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977521 4764 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977587 4764 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977701 4764 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977800 4764 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977884 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll5hm\" (UniqueName: \"kubernetes.io/projected/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-kube-api-access-ll5hm\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:21 crc kubenswrapper[4764]: I1203 23:55:21.977914 4764 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d26ed3c8-0bba-40a7-a18a-e8718b336dcc-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.444905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerStarted","Data":"695e2f29a8ad7897f37ddfc61012c3537894ca93f8ef45b7b8678417dac46163"} Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.451299 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-l7xdw_d26ed3c8-0bba-40a7-a18a-e8718b336dcc/console/0.log" Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.451353 4764 generic.go:334] "Generic (PLEG): container finished" podID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerID="b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646" exitCode=2 Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.451404 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l7xdw" 
event={"ID":"d26ed3c8-0bba-40a7-a18a-e8718b336dcc","Type":"ContainerDied","Data":"b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646"} Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.451437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-l7xdw" event={"ID":"d26ed3c8-0bba-40a7-a18a-e8718b336dcc","Type":"ContainerDied","Data":"915e11b7b27fbb8e0b530162730f310121c20f41c833492cc0a94f37c6346c27"} Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.451459 4764 scope.go:117] "RemoveContainer" containerID="b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646" Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.451582 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-l7xdw" Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.548157 4764 scope.go:117] "RemoveContainer" containerID="b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646" Dec 03 23:55:22 crc kubenswrapper[4764]: E1203 23:55:22.549169 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646\": container with ID starting with b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646 not found: ID does not exist" containerID="b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646" Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.549296 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646"} err="failed to get container status \"b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646\": rpc error: code = NotFound desc = could not find container \"b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646\": container with ID starting with 
b99f9d4ad0dfb1eadcab0a7c5fec3cb4781e4a773de15c21cb67ef21b0ad0646 not found: ID does not exist" Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.560390 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-l7xdw"] Dec 03 23:55:22 crc kubenswrapper[4764]: I1203 23:55:22.560582 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-l7xdw"] Dec 03 23:55:23 crc kubenswrapper[4764]: I1203 23:55:23.458236 4764 generic.go:334] "Generic (PLEG): container finished" podID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerID="695e2f29a8ad7897f37ddfc61012c3537894ca93f8ef45b7b8678417dac46163" exitCode=0 Dec 03 23:55:23 crc kubenswrapper[4764]: I1203 23:55:23.458276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerDied","Data":"695e2f29a8ad7897f37ddfc61012c3537894ca93f8ef45b7b8678417dac46163"} Dec 03 23:55:24 crc kubenswrapper[4764]: I1203 23:55:24.469779 4764 generic.go:334] "Generic (PLEG): container finished" podID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerID="98d15ad9921d58ff6c9a1417342432525ad779b164879943d4526386565e9dc6" exitCode=0 Dec 03 23:55:24 crc kubenswrapper[4764]: I1203 23:55:24.469851 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerDied","Data":"98d15ad9921d58ff6c9a1417342432525ad779b164879943d4526386565e9dc6"} Dec 03 23:55:24 crc kubenswrapper[4764]: I1203 23:55:24.557679 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" path="/var/lib/kubelet/pods/d26ed3c8-0bba-40a7-a18a-e8718b336dcc/volumes" Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.788316 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.932133 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-util\") pod \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.932229 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xszp\" (UniqueName: \"kubernetes.io/projected/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-kube-api-access-2xszp\") pod \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.932278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-bundle\") pod \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\" (UID: \"ddf82a6a-996e-42ed-b8e3-3ec8f6380323\") " Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.934079 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-bundle" (OuterVolumeSpecName: "bundle") pod "ddf82a6a-996e-42ed-b8e3-3ec8f6380323" (UID: "ddf82a6a-996e-42ed-b8e3-3ec8f6380323"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.938517 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-kube-api-access-2xszp" (OuterVolumeSpecName: "kube-api-access-2xszp") pod "ddf82a6a-996e-42ed-b8e3-3ec8f6380323" (UID: "ddf82a6a-996e-42ed-b8e3-3ec8f6380323"). 
InnerVolumeSpecName "kube-api-access-2xszp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:55:25 crc kubenswrapper[4764]: I1203 23:55:25.943658 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-util" (OuterVolumeSpecName: "util") pod "ddf82a6a-996e-42ed-b8e3-3ec8f6380323" (UID: "ddf82a6a-996e-42ed-b8e3-3ec8f6380323"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:55:26 crc kubenswrapper[4764]: I1203 23:55:26.032972 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-util\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:26 crc kubenswrapper[4764]: I1203 23:55:26.033041 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xszp\" (UniqueName: \"kubernetes.io/projected/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-kube-api-access-2xszp\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:26 crc kubenswrapper[4764]: I1203 23:55:26.033056 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ddf82a6a-996e-42ed-b8e3-3ec8f6380323-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:55:26 crc kubenswrapper[4764]: I1203 23:55:26.486381 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" event={"ID":"ddf82a6a-996e-42ed-b8e3-3ec8f6380323","Type":"ContainerDied","Data":"3e14bbcbd454dbb0d18de0e91cd62420062516191447d72c8bdf886a82cfc862"} Dec 03 23:55:26 crc kubenswrapper[4764]: I1203 23:55:26.486707 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e14bbcbd454dbb0d18de0e91cd62420062516191447d72c8bdf886a82cfc862" Dec 03 23:55:26 crc kubenswrapper[4764]: I1203 23:55:26.486827 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.513555 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr"] Dec 03 23:55:36 crc kubenswrapper[4764]: E1203 23:55:36.514267 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerName="pull" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514281 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerName="pull" Dec 03 23:55:36 crc kubenswrapper[4764]: E1203 23:55:36.514302 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerName="console" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514309 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerName="console" Dec 03 23:55:36 crc kubenswrapper[4764]: E1203 23:55:36.514350 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerName="util" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514357 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerName="util" Dec 03 23:55:36 crc kubenswrapper[4764]: E1203 23:55:36.514370 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerName="extract" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514376 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" containerName="extract" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514458 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf82a6a-996e-42ed-b8e3-3ec8f6380323" 
containerName="extract" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514473 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26ed3c8-0bba-40a7-a18a-e8718b336dcc" containerName="console" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.514848 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.516479 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.516618 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.517076 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dd689" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.517477 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.525379 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.534021 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr"] Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.576946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f588cdd9-3f46-470b-9c63-9de8eab25f1a-apiservice-cert\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 
03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.577007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f588cdd9-3f46-470b-9c63-9de8eab25f1a-webhook-cert\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.577105 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rwzs\" (UniqueName: \"kubernetes.io/projected/f588cdd9-3f46-470b-9c63-9de8eab25f1a-kube-api-access-5rwzs\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.678307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f588cdd9-3f46-470b-9c63-9de8eab25f1a-apiservice-cert\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.678364 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f588cdd9-3f46-470b-9c63-9de8eab25f1a-webhook-cert\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.678408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rwzs\" (UniqueName: 
\"kubernetes.io/projected/f588cdd9-3f46-470b-9c63-9de8eab25f1a-kube-api-access-5rwzs\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.694711 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f588cdd9-3f46-470b-9c63-9de8eab25f1a-apiservice-cert\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.694766 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f588cdd9-3f46-470b-9c63-9de8eab25f1a-webhook-cert\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.701656 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rwzs\" (UniqueName: \"kubernetes.io/projected/f588cdd9-3f46-470b-9c63-9de8eab25f1a-kube-api-access-5rwzs\") pod \"metallb-operator-controller-manager-5bb48b4db7-29gpr\" (UID: \"f588cdd9-3f46-470b-9c63-9de8eab25f1a\") " pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.774548 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9"] Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.775638 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.778269 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.778291 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zv79t" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.779139 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.794366 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9"] Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.832403 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.881373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbfc441b-442f-4232-85e4-51ab089ea1d9-apiservice-cert\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.881452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr2gt\" (UniqueName: \"kubernetes.io/projected/cbfc441b-442f-4232-85e4-51ab089ea1d9-kube-api-access-hr2gt\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:36 crc kubenswrapper[4764]: 
I1203 23:55:36.881495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbfc441b-442f-4232-85e4-51ab089ea1d9-webhook-cert\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.982963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbfc441b-442f-4232-85e4-51ab089ea1d9-webhook-cert\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.983353 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbfc441b-442f-4232-85e4-51ab089ea1d9-apiservice-cert\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:36 crc kubenswrapper[4764]: I1203 23:55:36.983399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr2gt\" (UniqueName: \"kubernetes.io/projected/cbfc441b-442f-4232-85e4-51ab089ea1d9-kube-api-access-hr2gt\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:36.994153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbfc441b-442f-4232-85e4-51ab089ea1d9-webhook-cert\") pod 
\"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.002298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbfc441b-442f-4232-85e4-51ab089ea1d9-apiservice-cert\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.017448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr2gt\" (UniqueName: \"kubernetes.io/projected/cbfc441b-442f-4232-85e4-51ab089ea1d9-kube-api-access-hr2gt\") pod \"metallb-operator-webhook-server-7769dfdc9d-slqg9\" (UID: \"cbfc441b-442f-4232-85e4-51ab089ea1d9\") " pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.091772 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.153699 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr"] Dec 03 23:55:37 crc kubenswrapper[4764]: W1203 23:55:37.166335 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf588cdd9_3f46_470b_9c63_9de8eab25f1a.slice/crio-7b0cccf0524b68d74b2880224459b33be34045e9cf53a091529509248d8eda65 WatchSource:0}: Error finding container 7b0cccf0524b68d74b2880224459b33be34045e9cf53a091529509248d8eda65: Status 404 returned error can't find the container with id 7b0cccf0524b68d74b2880224459b33be34045e9cf53a091529509248d8eda65 Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.318961 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9"] Dec 03 23:55:37 crc kubenswrapper[4764]: W1203 23:55:37.323849 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbfc441b_442f_4232_85e4_51ab089ea1d9.slice/crio-6cbe5a1d51ce946ed8281a0d52e455590339390bfc176c0288d7e4a1cf2cce01 WatchSource:0}: Error finding container 6cbe5a1d51ce946ed8281a0d52e455590339390bfc176c0288d7e4a1cf2cce01: Status 404 returned error can't find the container with id 6cbe5a1d51ce946ed8281a0d52e455590339390bfc176c0288d7e4a1cf2cce01 Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.553511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" event={"ID":"f588cdd9-3f46-470b-9c63-9de8eab25f1a","Type":"ContainerStarted","Data":"7b0cccf0524b68d74b2880224459b33be34045e9cf53a091529509248d8eda65"} Dec 03 23:55:37 crc kubenswrapper[4764]: I1203 23:55:37.554325 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" event={"ID":"cbfc441b-442f-4232-85e4-51ab089ea1d9","Type":"ContainerStarted","Data":"6cbe5a1d51ce946ed8281a0d52e455590339390bfc176c0288d7e4a1cf2cce01"} Dec 03 23:55:42 crc kubenswrapper[4764]: I1203 23:55:42.592691 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" event={"ID":"cbfc441b-442f-4232-85e4-51ab089ea1d9","Type":"ContainerStarted","Data":"ee714c4e4b090d8b4251989b99d67a928530f26e07193c860e653c437c083f00"} Dec 03 23:55:42 crc kubenswrapper[4764]: I1203 23:55:42.596783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" event={"ID":"f588cdd9-3f46-470b-9c63-9de8eab25f1a","Type":"ContainerStarted","Data":"b085a5522dbaa3e683710e78b21712458854e812283038ce464ac3d020f9cc79"} Dec 03 23:55:42 crc kubenswrapper[4764]: I1203 23:55:42.597201 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" Dec 03 23:55:42 crc kubenswrapper[4764]: I1203 23:55:42.617264 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9" podStartSLOduration=2.429369152 podStartE2EDuration="6.617242089s" podCreationTimestamp="2025-12-03 23:55:36 +0000 UTC" firstStartedPulling="2025-12-03 23:55:37.327175227 +0000 UTC m=+873.088499638" lastFinishedPulling="2025-12-03 23:55:41.515048154 +0000 UTC m=+877.276372575" observedRunningTime="2025-12-03 23:55:42.616786948 +0000 UTC m=+878.378111399" watchObservedRunningTime="2025-12-03 23:55:42.617242089 +0000 UTC m=+878.378566530" Dec 03 23:55:42 crc kubenswrapper[4764]: I1203 23:55:42.643614 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr" 
podStartSLOduration=2.296149943 podStartE2EDuration="6.643592316s" podCreationTimestamp="2025-12-03 23:55:36 +0000 UTC" firstStartedPulling="2025-12-03 23:55:37.169371975 +0000 UTC m=+872.930696386" lastFinishedPulling="2025-12-03 23:55:41.516814348 +0000 UTC m=+877.278138759" observedRunningTime="2025-12-03 23:55:42.640987982 +0000 UTC m=+878.402312423" watchObservedRunningTime="2025-12-03 23:55:42.643592316 +0000 UTC m=+878.404916767"
Dec 03 23:55:43 crc kubenswrapper[4764]: I1203 23:55:43.612517 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9"
Dec 03 23:55:57 crc kubenswrapper[4764]: I1203 23:55:57.097558 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7769dfdc9d-slqg9"
Dec 03 23:56:16 crc kubenswrapper[4764]: I1203 23:56:16.836888 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5bb48b4db7-29gpr"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.704709 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"]
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.705866 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.708443 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-gr4vt"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.708836 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.716983 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"]
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.732018 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r56k9"]
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.734264 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.738081 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.738106 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.764781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/049d1003-dccf-474e-8d17-359aa1ae6d95-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-g975j\" (UID: \"049d1003-dccf-474e-8d17-359aa1ae6d95\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.764845 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4tl8\" (UniqueName: \"kubernetes.io/projected/049d1003-dccf-474e-8d17-359aa1ae6d95-kube-api-access-z4tl8\") pod \"frr-k8s-webhook-server-7fcb986d4-g975j\" (UID: \"049d1003-dccf-474e-8d17-359aa1ae6d95\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.789688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-t6gtd"]
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.790568 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.793709 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.793953 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.794074 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-2fb49"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.794178 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.801869 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-nqrkc"]
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.802684 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.805113 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.828803 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-nqrkc"]
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866586 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-frr-sockets\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-frr-conf\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4tl8\" (UniqueName: \"kubernetes.io/projected/049d1003-dccf-474e-8d17-359aa1ae6d95-kube-api-access-z4tl8\") pod \"frr-k8s-webhook-server-7fcb986d4-g975j\" (UID: \"049d1003-dccf-474e-8d17-359aa1ae6d95\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-cert\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866943 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metallb-excludel2\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.866969 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjr9\" (UniqueName: \"kubernetes.io/projected/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-kube-api-access-wpjr9\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56658724-6546-4715-8fa6-6997065dad38-metrics-certs\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-reloader\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867089 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/56658724-6546-4715-8fa6-6997065dad38-frr-startup\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867139 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tmf5\" (UniqueName: \"kubernetes.io/projected/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-kube-api-access-7tmf5\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867182 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metrics-certs\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867207 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-metrics\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/049d1003-dccf-474e-8d17-359aa1ae6d95-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-g975j\" (UID: \"049d1003-dccf-474e-8d17-359aa1ae6d95\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8v5v\" (UniqueName: \"kubernetes.io/projected/56658724-6546-4715-8fa6-6997065dad38-kube-api-access-t8v5v\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.867407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-metrics-certs\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.873816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/049d1003-dccf-474e-8d17-359aa1ae6d95-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-g975j\" (UID: \"049d1003-dccf-474e-8d17-359aa1ae6d95\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.890438 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4tl8\" (UniqueName: \"kubernetes.io/projected/049d1003-dccf-474e-8d17-359aa1ae6d95-kube-api-access-z4tl8\") pod \"frr-k8s-webhook-server-7fcb986d4-g975j\" (UID: \"049d1003-dccf-474e-8d17-359aa1ae6d95\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969048 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8v5v\" (UniqueName: \"kubernetes.io/projected/56658724-6546-4715-8fa6-6997065dad38-kube-api-access-t8v5v\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-metrics-certs\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969122 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-frr-sockets\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-frr-conf\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-cert\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metallb-excludel2\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969236 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjr9\" (UniqueName: \"kubernetes.io/projected/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-kube-api-access-wpjr9\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.969255 4764 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.969326 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.969354 4764 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.969332 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-metrics-certs podName:6e46fa91-e07b-4b3a-97fd-e1aa7608eb87 nodeName:}" failed. No retries permitted until 2025-12-03 23:56:18.469311572 +0000 UTC m=+914.230635983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-metrics-certs") pod "controller-f8648f98b-nqrkc" (UID: "6e46fa91-e07b-4b3a-97fd-e1aa7608eb87") : secret "controller-certs-secret" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.969418 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist podName:086eaaf6-7fad-4866-b78a-4123a4f6e9a1 nodeName:}" failed. No retries permitted until 2025-12-03 23:56:18.469398634 +0000 UTC m=+914.230723045 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist") pod "speaker-t6gtd" (UID: "086eaaf6-7fad-4866-b78a-4123a4f6e9a1") : secret "metallb-memberlist" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.969432 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56658724-6546-4715-8fa6-6997065dad38-metrics-certs podName:56658724-6546-4715-8fa6-6997065dad38 nodeName:}" failed. No retries permitted until 2025-12-03 23:56:18.469425075 +0000 UTC m=+914.230749486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56658724-6546-4715-8fa6-6997065dad38-metrics-certs") pod "frr-k8s-r56k9" (UID: "56658724-6546-4715-8fa6-6997065dad38") : secret "frr-k8s-certs-secret" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56658724-6546-4715-8fa6-6997065dad38-metrics-certs\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-reloader\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/56658724-6546-4715-8fa6-6997065dad38-frr-startup\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tmf5\" (UniqueName: \"kubernetes.io/projected/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-kube-api-access-7tmf5\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969587 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metrics-certs\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-metrics\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-frr-conf\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.969924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-reloader\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.970028 4764 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: E1203 23:56:17.970076 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metrics-certs podName:086eaaf6-7fad-4866-b78a-4123a4f6e9a1 nodeName:}" failed. No retries permitted until 2025-12-03 23:56:18.470059101 +0000 UTC m=+914.231383512 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metrics-certs") pod "speaker-t6gtd" (UID: "086eaaf6-7fad-4866-b78a-4123a4f6e9a1") : secret "speaker-certs-secret" not found
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.970087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-metrics\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.970243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metallb-excludel2\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.970478 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/56658724-6546-4715-8fa6-6997065dad38-frr-sockets\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.970486 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/56658724-6546-4715-8fa6-6997065dad38-frr-startup\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.973235 4764 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Dec 03 23:56:17 crc kubenswrapper[4764]: I1203 23:56:17.999422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-cert\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.002128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tmf5\" (UniqueName: \"kubernetes.io/projected/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-kube-api-access-7tmf5\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.009888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjr9\" (UniqueName: \"kubernetes.io/projected/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-kube-api-access-wpjr9\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.014435 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8v5v\" (UniqueName: \"kubernetes.io/projected/56658724-6546-4715-8fa6-6997065dad38-kube-api-access-t8v5v\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.023036 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.262808 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"]
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.476123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56658724-6546-4715-8fa6-6997065dad38-metrics-certs\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.476188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metrics-certs\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.476232 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-metrics-certs\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.476266 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:18 crc kubenswrapper[4764]: E1203 23:56:18.476413 4764 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Dec 03 23:56:18 crc kubenswrapper[4764]: E1203 23:56:18.476470 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist podName:086eaaf6-7fad-4866-b78a-4123a4f6e9a1 nodeName:}" failed. No retries permitted until 2025-12-03 23:56:19.476451886 +0000 UTC m=+915.237776297 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist") pod "speaker-t6gtd" (UID: "086eaaf6-7fad-4866-b78a-4123a4f6e9a1") : secret "metallb-memberlist" not found
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.480333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56658724-6546-4715-8fa6-6997065dad38-metrics-certs\") pod \"frr-k8s-r56k9\" (UID: \"56658724-6546-4715-8fa6-6997065dad38\") " pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.480498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-metrics-certs\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.481845 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e46fa91-e07b-4b3a-97fd-e1aa7608eb87-metrics-certs\") pod \"controller-f8648f98b-nqrkc\" (UID: \"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87\") " pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.656085 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.714120 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.832548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j" event={"ID":"049d1003-dccf-474e-8d17-359aa1ae6d95","Type":"ContainerStarted","Data":"e4f669a29c65183d6da5f5ca2f9508e5a6e65d61d0bd5328350ae3440e6a7203"}
Dec 03 23:56:18 crc kubenswrapper[4764]: I1203 23:56:18.982047 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-nqrkc"]
Dec 03 23:56:18 crc kubenswrapper[4764]: W1203 23:56:18.986730 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e46fa91_e07b_4b3a_97fd_e1aa7608eb87.slice/crio-09b822ecd23b71aa5e770bc0684da4f24274f60f998ede9fba787aac5f33dba8 WatchSource:0}: Error finding container 09b822ecd23b71aa5e770bc0684da4f24274f60f998ede9fba787aac5f33dba8: Status 404 returned error can't find the container with id 09b822ecd23b71aa5e770bc0684da4f24274f60f998ede9fba787aac5f33dba8
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.490342 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.498948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/086eaaf6-7fad-4866-b78a-4123a4f6e9a1-memberlist\") pod \"speaker-t6gtd\" (UID: \"086eaaf6-7fad-4866-b78a-4123a4f6e9a1\") " pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.602641 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:19 crc kubenswrapper[4764]: W1203 23:56:19.621108 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod086eaaf6_7fad_4866_b78a_4123a4f6e9a1.slice/crio-666357f525fe734bf900fbfddc9faeea6efcf3d144bf4df0a0c697a7b2603543 WatchSource:0}: Error finding container 666357f525fe734bf900fbfddc9faeea6efcf3d144bf4df0a0c697a7b2603543: Status 404 returned error can't find the container with id 666357f525fe734bf900fbfddc9faeea6efcf3d144bf4df0a0c697a7b2603543
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.843918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"a5ea7db8512065799df1467203553f5e2d84fafbe99d6a8e84f6a52a9d0e9326"}
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.848958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-nqrkc" event={"ID":"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87","Type":"ContainerStarted","Data":"a4410f285c3201505718e426d14c048d13b9cf1992f7d85a0dc1d0e9c123b08c"}
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.849005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-nqrkc" event={"ID":"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87","Type":"ContainerStarted","Data":"e63c2bb709550e77fc4290ba9bec39fa5d2ad25388f11e9f412269096095e4d4"}
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.849015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-nqrkc" event={"ID":"6e46fa91-e07b-4b3a-97fd-e1aa7608eb87","Type":"ContainerStarted","Data":"09b822ecd23b71aa5e770bc0684da4f24274f60f998ede9fba787aac5f33dba8"}
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.849070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-nqrkc"
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.851777 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t6gtd" event={"ID":"086eaaf6-7fad-4866-b78a-4123a4f6e9a1","Type":"ContainerStarted","Data":"666357f525fe734bf900fbfddc9faeea6efcf3d144bf4df0a0c697a7b2603543"}
Dec 03 23:56:19 crc kubenswrapper[4764]: I1203 23:56:19.929944 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-nqrkc" podStartSLOduration=2.929928571 podStartE2EDuration="2.929928571s" podCreationTimestamp="2025-12-03 23:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:56:19.927331057 +0000 UTC m=+915.688655468" watchObservedRunningTime="2025-12-03 23:56:19.929928571 +0000 UTC m=+915.691252982"
Dec 03 23:56:20 crc kubenswrapper[4764]: I1203 23:56:20.875414 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t6gtd" event={"ID":"086eaaf6-7fad-4866-b78a-4123a4f6e9a1","Type":"ContainerStarted","Data":"4d29574ac1803596abed632000b4dafb59f0dcedbd9f512f234855f4e71da711"}
Dec 03 23:56:20 crc kubenswrapper[4764]: I1203 23:56:20.875472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-t6gtd" event={"ID":"086eaaf6-7fad-4866-b78a-4123a4f6e9a1","Type":"ContainerStarted","Data":"4f78cdac8f2b00c50f24369434405dc65d1ce548febc283d482ef3667b6704b7"}
Dec 03 23:56:20 crc kubenswrapper[4764]: I1203 23:56:20.875835 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:20 crc kubenswrapper[4764]: I1203 23:56:20.903155 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-t6gtd" podStartSLOduration=3.90313801 podStartE2EDuration="3.90313801s" podCreationTimestamp="2025-12-03 23:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:56:20.901181252 +0000 UTC m=+916.662505663" watchObservedRunningTime="2025-12-03 23:56:20.90313801 +0000 UTC m=+916.664462421"
Dec 03 23:56:25 crc kubenswrapper[4764]: I1203 23:56:25.924248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j" event={"ID":"049d1003-dccf-474e-8d17-359aa1ae6d95","Type":"ContainerStarted","Data":"6c0613f4379a28832b7ee8dff8cad4911f6f95dd97f5aaf359ec64b21b573b3d"}
Dec 03 23:56:25 crc kubenswrapper[4764]: I1203 23:56:25.924904 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j"
Dec 03 23:56:25 crc kubenswrapper[4764]: I1203 23:56:25.926366 4764 generic.go:334] "Generic (PLEG): container finished" podID="56658724-6546-4715-8fa6-6997065dad38" containerID="94e5d586f6193f1b1e9797f68b7834964280303c511aff4052822e2e8a06759c" exitCode=0
Dec 03 23:56:25 crc kubenswrapper[4764]: I1203 23:56:25.926408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerDied","Data":"94e5d586f6193f1b1e9797f68b7834964280303c511aff4052822e2e8a06759c"}
Dec 03 23:56:25 crc kubenswrapper[4764]: I1203 23:56:25.940701 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j" podStartSLOduration=2.358284864 podStartE2EDuration="8.940669226s" podCreationTimestamp="2025-12-03 23:56:17 +0000 UTC" firstStartedPulling="2025-12-03 23:56:18.271132798 +0000 UTC m=+914.032457209" lastFinishedPulling="2025-12-03 23:56:24.85351716 +0000 UTC m=+920.614841571" observedRunningTime="2025-12-03 23:56:25.937346124 +0000 UTC m=+921.698670535" watchObservedRunningTime="2025-12-03 23:56:25.940669226 +0000 UTC m=+921.701993667"
Dec 03 23:56:26 crc kubenswrapper[4764]: I1203 23:56:26.954289 4764 generic.go:334] "Generic (PLEG): container finished" podID="56658724-6546-4715-8fa6-6997065dad38" containerID="1df90df63f1218a5737d546656c509cf7e809835b055ce30626e3f01f878601d" exitCode=0
Dec 03 23:56:26 crc kubenswrapper[4764]: I1203 23:56:26.954379 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerDied","Data":"1df90df63f1218a5737d546656c509cf7e809835b055ce30626e3f01f878601d"}
Dec 03 23:56:27 crc kubenswrapper[4764]: I1203 23:56:27.977602 4764 generic.go:334] "Generic (PLEG): container finished" podID="56658724-6546-4715-8fa6-6997065dad38" containerID="059ffc6f6c4046a8475d00a6b0b835e8e1ed52e678bd590d02fc9873ccb56807" exitCode=0
Dec 03 23:56:27 crc kubenswrapper[4764]: I1203 23:56:27.977676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerDied","Data":"059ffc6f6c4046a8475d00a6b0b835e8e1ed52e678bd590d02fc9873ccb56807"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.993014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"6e01700922fc9bff1cf8ba5058927755bd873eead8828220795eab549c6d2385"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.993348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"19b733840d6c60a9a7e1547f243b923323e5fb8be64ea463820c4e22c6145920"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.993362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"b828c89111e529050348e76948507adc8a4d2338046c05808cd844f072a766d0"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.993375 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"0a1d1dea7dd0f9c7eba55e33063e37fd1b15e502e630a1c09eb6db2af8363d6c"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.993386 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"04fb7fff00efff4ec2909e781b97538a66e445f7275d29b3122fb556be7305f7"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.993397 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r56k9" event={"ID":"56658724-6546-4715-8fa6-6997065dad38","Type":"ContainerStarted","Data":"2e5572cfe702d6ade1dd2b6583499be833670c4d48251ca183a054e8e3d201e2"}
Dec 03 23:56:28 crc kubenswrapper[4764]: I1203 23:56:28.994425 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r56k9"
Dec 03 23:56:29 crc kubenswrapper[4764]: I1203 23:56:29.607430 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-t6gtd"
Dec 03 23:56:29 crc kubenswrapper[4764]: I1203 23:56:29.628171 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r56k9" podStartSLOduration=6.595231926 podStartE2EDuration="12.628152716s" podCreationTimestamp="2025-12-03 23:56:17 +0000 UTC" firstStartedPulling="2025-12-03 23:56:18.826922746 +0000 UTC m=+914.588247197" lastFinishedPulling="2025-12-03 23:56:24.859843576 +0000 UTC m=+920.621167987" observedRunningTime="2025-12-03 23:56:29.027506328 +0000 UTC m=+924.788830749" watchObservedRunningTime="2025-12-03 23:56:29.628152716 +0000
UTC m=+925.389477127" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.176067 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8"] Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.179023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.187662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8"] Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.196894 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.274330 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.274391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7762\" (UniqueName: \"kubernetes.io/projected/b1930095-9d98-42ce-bc7e-46ac75742d43-kube-api-access-j7762\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.274507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.375609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7762\" (UniqueName: \"kubernetes.io/projected/b1930095-9d98-42ce-bc7e-46ac75742d43-kube-api-access-j7762\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.375690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.375769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.376314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" 
(UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.376421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.405106 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7762\" (UniqueName: \"kubernetes.io/projected/b1930095-9d98-42ce-bc7e-46ac75742d43-kube-api-access-j7762\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:31 crc kubenswrapper[4764]: I1203 23:56:31.496750 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:32 crc kubenswrapper[4764]: W1203 23:56:32.021631 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1930095_9d98_42ce_bc7e_46ac75742d43.slice/crio-479c273811b8093fb48cd4f9a11d11565771f1b3e6a74b5c23265f48a0a6f089 WatchSource:0}: Error finding container 479c273811b8093fb48cd4f9a11d11565771f1b3e6a74b5c23265f48a0a6f089: Status 404 returned error can't find the container with id 479c273811b8093fb48cd4f9a11d11565771f1b3e6a74b5c23265f48a0a6f089 Dec 03 23:56:32 crc kubenswrapper[4764]: I1203 23:56:32.024047 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8"] Dec 03 23:56:33 crc kubenswrapper[4764]: I1203 23:56:33.019936 4764 generic.go:334] "Generic (PLEG): container finished" podID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerID="77d33d1c8f4f7ed11feb5cfcfdad5885adfa30b49c22d153bf13c27946f24d89" exitCode=0 Dec 03 23:56:33 crc kubenswrapper[4764]: I1203 23:56:33.020018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" event={"ID":"b1930095-9d98-42ce-bc7e-46ac75742d43","Type":"ContainerDied","Data":"77d33d1c8f4f7ed11feb5cfcfdad5885adfa30b49c22d153bf13c27946f24d89"} Dec 03 23:56:33 crc kubenswrapper[4764]: I1203 23:56:33.020259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" event={"ID":"b1930095-9d98-42ce-bc7e-46ac75742d43","Type":"ContainerStarted","Data":"479c273811b8093fb48cd4f9a11d11565771f1b3e6a74b5c23265f48a0a6f089"} Dec 03 23:56:33 crc kubenswrapper[4764]: I1203 23:56:33.656401 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-r56k9" Dec 03 23:56:33 crc kubenswrapper[4764]: I1203 23:56:33.690910 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r56k9" Dec 03 23:56:38 crc kubenswrapper[4764]: I1203 23:56:38.029310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-g975j" Dec 03 23:56:38 crc kubenswrapper[4764]: I1203 23:56:38.055347 4764 generic.go:334] "Generic (PLEG): container finished" podID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerID="86b81297176e05f0c27c98496f2b86a3965e22c4bbe4b038fa9fb9924911c867" exitCode=0 Dec 03 23:56:38 crc kubenswrapper[4764]: I1203 23:56:38.055400 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" event={"ID":"b1930095-9d98-42ce-bc7e-46ac75742d43","Type":"ContainerDied","Data":"86b81297176e05f0c27c98496f2b86a3965e22c4bbe4b038fa9fb9924911c867"} Dec 03 23:56:38 crc kubenswrapper[4764]: I1203 23:56:38.659912 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r56k9" Dec 03 23:56:38 crc kubenswrapper[4764]: I1203 23:56:38.719899 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-nqrkc" Dec 03 23:56:39 crc kubenswrapper[4764]: I1203 23:56:39.066103 4764 generic.go:334] "Generic (PLEG): container finished" podID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerID="a749200ad0775f43634d71f842ec380ca1aa11ba34c507df53c00756dee9919a" exitCode=0 Dec 03 23:56:39 crc kubenswrapper[4764]: I1203 23:56:39.066160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" event={"ID":"b1930095-9d98-42ce-bc7e-46ac75742d43","Type":"ContainerDied","Data":"a749200ad0775f43634d71f842ec380ca1aa11ba34c507df53c00756dee9919a"} Dec 03 
23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.318493 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.416797 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7762\" (UniqueName: \"kubernetes.io/projected/b1930095-9d98-42ce-bc7e-46ac75742d43-kube-api-access-j7762\") pod \"b1930095-9d98-42ce-bc7e-46ac75742d43\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.416941 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-bundle\") pod \"b1930095-9d98-42ce-bc7e-46ac75742d43\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.417001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-util\") pod \"b1930095-9d98-42ce-bc7e-46ac75742d43\" (UID: \"b1930095-9d98-42ce-bc7e-46ac75742d43\") " Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.418379 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-bundle" (OuterVolumeSpecName: "bundle") pod "b1930095-9d98-42ce-bc7e-46ac75742d43" (UID: "b1930095-9d98-42ce-bc7e-46ac75742d43"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.421754 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1930095-9d98-42ce-bc7e-46ac75742d43-kube-api-access-j7762" (OuterVolumeSpecName: "kube-api-access-j7762") pod "b1930095-9d98-42ce-bc7e-46ac75742d43" (UID: "b1930095-9d98-42ce-bc7e-46ac75742d43"). InnerVolumeSpecName "kube-api-access-j7762". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.429302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-util" (OuterVolumeSpecName: "util") pod "b1930095-9d98-42ce-bc7e-46ac75742d43" (UID: "b1930095-9d98-42ce-bc7e-46ac75742d43"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.518970 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-util\") on node \"crc\" DevicePath \"\"" Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.519008 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7762\" (UniqueName: \"kubernetes.io/projected/b1930095-9d98-42ce-bc7e-46ac75742d43-kube-api-access-j7762\") on node \"crc\" DevicePath \"\"" Dec 03 23:56:40 crc kubenswrapper[4764]: I1203 23:56:40.519022 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b1930095-9d98-42ce-bc7e-46ac75742d43-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 23:56:41 crc kubenswrapper[4764]: I1203 23:56:41.082158 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" 
event={"ID":"b1930095-9d98-42ce-bc7e-46ac75742d43","Type":"ContainerDied","Data":"479c273811b8093fb48cd4f9a11d11565771f1b3e6a74b5c23265f48a0a6f089"} Dec 03 23:56:41 crc kubenswrapper[4764]: I1203 23:56:41.082448 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="479c273811b8093fb48cd4f9a11d11565771f1b3e6a74b5c23265f48a0a6f089" Dec 03 23:56:41 crc kubenswrapper[4764]: I1203 23:56:41.082230 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.604435 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl"] Dec 03 23:56:44 crc kubenswrapper[4764]: E1203 23:56:44.605047 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="util" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.605066 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="util" Dec 03 23:56:44 crc kubenswrapper[4764]: E1203 23:56:44.605089 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="pull" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.605097 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="pull" Dec 03 23:56:44 crc kubenswrapper[4764]: E1203 23:56:44.605121 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="extract" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.605131 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="extract" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.605838 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1930095-9d98-42ce-bc7e-46ac75742d43" containerName="extract" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.606539 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.612422 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.612532 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.612440 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-zsggf" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.616211 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl"] Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.671815 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd72cf30-fefb-496d-96d4-906f7514645c-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fc8hl\" (UID: \"cd72cf30-fefb-496d-96d4-906f7514645c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.671919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75b42\" (UniqueName: \"kubernetes.io/projected/cd72cf30-fefb-496d-96d4-906f7514645c-kube-api-access-75b42\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fc8hl\" (UID: \"cd72cf30-fefb-496d-96d4-906f7514645c\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.773248 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd72cf30-fefb-496d-96d4-906f7514645c-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fc8hl\" (UID: \"cd72cf30-fefb-496d-96d4-906f7514645c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.773308 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75b42\" (UniqueName: \"kubernetes.io/projected/cd72cf30-fefb-496d-96d4-906f7514645c-kube-api-access-75b42\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fc8hl\" (UID: \"cd72cf30-fefb-496d-96d4-906f7514645c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.774098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd72cf30-fefb-496d-96d4-906f7514645c-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fc8hl\" (UID: \"cd72cf30-fefb-496d-96d4-906f7514645c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.795853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75b42\" (UniqueName: \"kubernetes.io/projected/cd72cf30-fefb-496d-96d4-906f7514645c-kube-api-access-75b42\") pod \"cert-manager-operator-controller-manager-64cf6dff88-fc8hl\" (UID: \"cd72cf30-fefb-496d-96d4-906f7514645c\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:44 crc kubenswrapper[4764]: I1203 23:56:44.949228 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" Dec 03 23:56:45 crc kubenswrapper[4764]: I1203 23:56:45.296403 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl"] Dec 03 23:56:45 crc kubenswrapper[4764]: W1203 23:56:45.311575 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd72cf30_fefb_496d_96d4_906f7514645c.slice/crio-5c7f441833d11992eba6a4c8cb13ebac6c274bd1bf423c479e639da73b1b5629 WatchSource:0}: Error finding container 5c7f441833d11992eba6a4c8cb13ebac6c274bd1bf423c479e639da73b1b5629: Status 404 returned error can't find the container with id 5c7f441833d11992eba6a4c8cb13ebac6c274bd1bf423c479e639da73b1b5629 Dec 03 23:56:46 crc kubenswrapper[4764]: I1203 23:56:46.117546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" event={"ID":"cd72cf30-fefb-496d-96d4-906f7514645c","Type":"ContainerStarted","Data":"5c7f441833d11992eba6a4c8cb13ebac6c274bd1bf423c479e639da73b1b5629"} Dec 03 23:56:52 crc kubenswrapper[4764]: I1203 23:56:52.171695 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" event={"ID":"cd72cf30-fefb-496d-96d4-906f7514645c","Type":"ContainerStarted","Data":"2d712be55ecffc80c902d855ff17b65419beab402d63ef0124785cf68d6ffb7a"} Dec 03 23:56:52 crc kubenswrapper[4764]: I1203 23:56:52.191254 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-fc8hl" podStartSLOduration=1.6473922399999998 podStartE2EDuration="8.191237896s" podCreationTimestamp="2025-12-03 23:56:44 +0000 UTC" firstStartedPulling="2025-12-03 23:56:45.319822522 +0000 UTC 
m=+941.081146933" lastFinishedPulling="2025-12-03 23:56:51.863668188 +0000 UTC m=+947.624992589" observedRunningTime="2025-12-03 23:56:52.189586005 +0000 UTC m=+947.950910416" watchObservedRunningTime="2025-12-03 23:56:52.191237896 +0000 UTC m=+947.952562317" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.388704 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-g74lr"] Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.389888 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.391798 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-4s2bx" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.392042 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.392629 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.399812 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-g74lr"] Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.543749 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjhwb\" (UniqueName: \"kubernetes.io/projected/5363aea9-b6c1-4b4b-9446-30b2bba729cc-kube-api-access-sjhwb\") pod \"cert-manager-webhook-f4fb5df64-g74lr\" (UID: \"5363aea9-b6c1-4b4b-9446-30b2bba729cc\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.543824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/5363aea9-b6c1-4b4b-9446-30b2bba729cc-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-g74lr\" (UID: \"5363aea9-b6c1-4b4b-9446-30b2bba729cc\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.645524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjhwb\" (UniqueName: \"kubernetes.io/projected/5363aea9-b6c1-4b4b-9446-30b2bba729cc-kube-api-access-sjhwb\") pod \"cert-manager-webhook-f4fb5df64-g74lr\" (UID: \"5363aea9-b6c1-4b4b-9446-30b2bba729cc\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.645615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5363aea9-b6c1-4b4b-9446-30b2bba729cc-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-g74lr\" (UID: \"5363aea9-b6c1-4b4b-9446-30b2bba729cc\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.667831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjhwb\" (UniqueName: \"kubernetes.io/projected/5363aea9-b6c1-4b4b-9446-30b2bba729cc-kube-api-access-sjhwb\") pod \"cert-manager-webhook-f4fb5df64-g74lr\" (UID: \"5363aea9-b6c1-4b4b-9446-30b2bba729cc\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.682512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5363aea9-b6c1-4b4b-9446-30b2bba729cc-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-g74lr\" (UID: \"5363aea9-b6c1-4b4b-9446-30b2bba729cc\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" Dec 03 23:56:54 crc kubenswrapper[4764]: I1203 23:56:54.748843 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr"
Dec 03 23:56:55 crc kubenswrapper[4764]: I1203 23:56:55.214941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-g74lr"]
Dec 03 23:56:56 crc kubenswrapper[4764]: I1203 23:56:56.200739 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" event={"ID":"5363aea9-b6c1-4b4b-9446-30b2bba729cc","Type":"ContainerStarted","Data":"97b596f5fae26d2b2c961ceb937fb9c3a624e7dd72cec7899a6f4c2637b152c5"}
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.071083 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"]
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.072507 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.074662 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-h6xz8"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.093496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"]
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.200547 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5aa91780-e991-4b90-abba-bcdeba0898d4-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vjvpk\" (UID: \"5aa91780-e991-4b90-abba-bcdeba0898d4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.200610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp78d\" (UniqueName: \"kubernetes.io/projected/5aa91780-e991-4b90-abba-bcdeba0898d4-kube-api-access-sp78d\") pod \"cert-manager-cainjector-855d9ccff4-vjvpk\" (UID: \"5aa91780-e991-4b90-abba-bcdeba0898d4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.301842 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp78d\" (UniqueName: \"kubernetes.io/projected/5aa91780-e991-4b90-abba-bcdeba0898d4-kube-api-access-sp78d\") pod \"cert-manager-cainjector-855d9ccff4-vjvpk\" (UID: \"5aa91780-e991-4b90-abba-bcdeba0898d4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.301949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5aa91780-e991-4b90-abba-bcdeba0898d4-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vjvpk\" (UID: \"5aa91780-e991-4b90-abba-bcdeba0898d4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.325635 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5aa91780-e991-4b90-abba-bcdeba0898d4-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-vjvpk\" (UID: \"5aa91780-e991-4b90-abba-bcdeba0898d4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.326476 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp78d\" (UniqueName: \"kubernetes.io/projected/5aa91780-e991-4b90-abba-bcdeba0898d4-kube-api-access-sp78d\") pod \"cert-manager-cainjector-855d9ccff4-vjvpk\" (UID: \"5aa91780-e991-4b90-abba-bcdeba0898d4\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:56:58 crc kubenswrapper[4764]: I1203 23:56:58.397279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"
Dec 03 23:57:02 crc kubenswrapper[4764]: I1203 23:57:02.698765 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk"]
Dec 03 23:57:02 crc kubenswrapper[4764]: W1203 23:57:02.718891 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5aa91780_e991_4b90_abba_bcdeba0898d4.slice/crio-a9dc9179aa7e8656aabc89a909f286dab50d1114c41a3d8fc08a0d8b6a97eed3 WatchSource:0}: Error finding container a9dc9179aa7e8656aabc89a909f286dab50d1114c41a3d8fc08a0d8b6a97eed3: Status 404 returned error can't find the container with id a9dc9179aa7e8656aabc89a909f286dab50d1114c41a3d8fc08a0d8b6a97eed3
Dec 03 23:57:03 crc kubenswrapper[4764]: I1203 23:57:03.258908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" event={"ID":"5363aea9-b6c1-4b4b-9446-30b2bba729cc","Type":"ContainerStarted","Data":"85010a96b2637644836f00817850510307c5ed65879f2926737183a54e96d8c2"}
Dec 03 23:57:03 crc kubenswrapper[4764]: I1203 23:57:03.259059 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr"
Dec 03 23:57:03 crc kubenswrapper[4764]: I1203 23:57:03.260131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk" event={"ID":"5aa91780-e991-4b90-abba-bcdeba0898d4","Type":"ContainerStarted","Data":"cffa4d4bc26bdd7e04fae1e3916e2a5ad0c4981e5c2b0094a7332a16a0f155ee"}
Dec 03 23:57:03 crc kubenswrapper[4764]: I1203 23:57:03.260167 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk" event={"ID":"5aa91780-e991-4b90-abba-bcdeba0898d4","Type":"ContainerStarted","Data":"a9dc9179aa7e8656aabc89a909f286dab50d1114c41a3d8fc08a0d8b6a97eed3"}
Dec 03 23:57:03 crc kubenswrapper[4764]: I1203 23:57:03.275587 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr" podStartSLOduration=1.9040824170000001 podStartE2EDuration="9.275570682s" podCreationTimestamp="2025-12-03 23:56:54 +0000 UTC" firstStartedPulling="2025-12-03 23:56:55.223794015 +0000 UTC m=+950.985118426" lastFinishedPulling="2025-12-03 23:57:02.59528228 +0000 UTC m=+958.356606691" observedRunningTime="2025-12-03 23:57:03.274133337 +0000 UTC m=+959.035457748" watchObservedRunningTime="2025-12-03 23:57:03.275570682 +0000 UTC m=+959.036895103"
Dec 03 23:57:03 crc kubenswrapper[4764]: I1203 23:57:03.291636 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-vjvpk" podStartSLOduration=5.291614346 podStartE2EDuration="5.291614346s" podCreationTimestamp="2025-12-03 23:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:57:03.290110109 +0000 UTC m=+959.051434520" watchObservedRunningTime="2025-12-03 23:57:03.291614346 +0000 UTC m=+959.052938767"
Dec 03 23:57:05 crc kubenswrapper[4764]: I1203 23:57:05.842774 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-p8rfr"]
Dec 03 23:57:05 crc kubenswrapper[4764]: I1203 23:57:05.844512 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:05 crc kubenswrapper[4764]: I1203 23:57:05.851851 4764 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-bxw6d"
Dec 03 23:57:05 crc kubenswrapper[4764]: I1203 23:57:05.852588 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-p8rfr"]
Dec 03 23:57:05 crc kubenswrapper[4764]: I1203 23:57:05.914668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba1a186a-006d-4fa1-a90c-48d4220661ba-bound-sa-token\") pod \"cert-manager-86cb77c54b-p8rfr\" (UID: \"ba1a186a-006d-4fa1-a90c-48d4220661ba\") " pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:05 crc kubenswrapper[4764]: I1203 23:57:05.914745 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzzvt\" (UniqueName: \"kubernetes.io/projected/ba1a186a-006d-4fa1-a90c-48d4220661ba-kube-api-access-zzzvt\") pod \"cert-manager-86cb77c54b-p8rfr\" (UID: \"ba1a186a-006d-4fa1-a90c-48d4220661ba\") " pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:06 crc kubenswrapper[4764]: I1203 23:57:06.016310 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba1a186a-006d-4fa1-a90c-48d4220661ba-bound-sa-token\") pod \"cert-manager-86cb77c54b-p8rfr\" (UID: \"ba1a186a-006d-4fa1-a90c-48d4220661ba\") " pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:06 crc kubenswrapper[4764]: I1203 23:57:06.016367 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzzvt\" (UniqueName: \"kubernetes.io/projected/ba1a186a-006d-4fa1-a90c-48d4220661ba-kube-api-access-zzzvt\") pod \"cert-manager-86cb77c54b-p8rfr\" (UID: \"ba1a186a-006d-4fa1-a90c-48d4220661ba\") " pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:06 crc kubenswrapper[4764]: I1203 23:57:06.041822 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba1a186a-006d-4fa1-a90c-48d4220661ba-bound-sa-token\") pod \"cert-manager-86cb77c54b-p8rfr\" (UID: \"ba1a186a-006d-4fa1-a90c-48d4220661ba\") " pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:06 crc kubenswrapper[4764]: I1203 23:57:06.042359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzzvt\" (UniqueName: \"kubernetes.io/projected/ba1a186a-006d-4fa1-a90c-48d4220661ba-kube-api-access-zzzvt\") pod \"cert-manager-86cb77c54b-p8rfr\" (UID: \"ba1a186a-006d-4fa1-a90c-48d4220661ba\") " pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:06 crc kubenswrapper[4764]: I1203 23:57:06.166288 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-p8rfr"
Dec 03 23:57:06 crc kubenswrapper[4764]: I1203 23:57:06.629472 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-p8rfr"]
Dec 03 23:57:07 crc kubenswrapper[4764]: I1203 23:57:07.287027 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-p8rfr" event={"ID":"ba1a186a-006d-4fa1-a90c-48d4220661ba","Type":"ContainerStarted","Data":"64a14519973a84f1ac59b7633156004a584d32bdd8c84a457b36be7b734019cb"}
Dec 03 23:57:07 crc kubenswrapper[4764]: I1203 23:57:07.287549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-p8rfr" event={"ID":"ba1a186a-006d-4fa1-a90c-48d4220661ba","Type":"ContainerStarted","Data":"4bc80422bb7b633f4a265cf89e59943a1c137e40570214c1e4ecb36e66c448b8"}
Dec 03 23:57:07 crc kubenswrapper[4764]: I1203 23:57:07.311960 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-p8rfr" podStartSLOduration=2.311940302 podStartE2EDuration="2.311940302s" podCreationTimestamp="2025-12-03 23:57:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:57:07.308753984 +0000 UTC m=+963.070078395" watchObservedRunningTime="2025-12-03 23:57:07.311940302 +0000 UTC m=+963.073264723"
Dec 03 23:57:09 crc kubenswrapper[4764]: I1203 23:57:09.751912 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-g74lr"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.537288 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ph7dn"]
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.538395 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.541783 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5dtq4"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.541841 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.542292 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.565309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ph7dn"]
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.617704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxj5t\" (UniqueName: \"kubernetes.io/projected/c632a187-d000-4581-bdbb-9f0f47e448cc-kube-api-access-hxj5t\") pod \"openstack-operator-index-ph7dn\" (UID: \"c632a187-d000-4581-bdbb-9f0f47e448cc\") " pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.719543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxj5t\" (UniqueName: \"kubernetes.io/projected/c632a187-d000-4581-bdbb-9f0f47e448cc-kube-api-access-hxj5t\") pod \"openstack-operator-index-ph7dn\" (UID: \"c632a187-d000-4581-bdbb-9f0f47e448cc\") " pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.736899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxj5t\" (UniqueName: \"kubernetes.io/projected/c632a187-d000-4581-bdbb-9f0f47e448cc-kube-api-access-hxj5t\") pod \"openstack-operator-index-ph7dn\" (UID: \"c632a187-d000-4581-bdbb-9f0f47e448cc\") " pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:13 crc kubenswrapper[4764]: I1203 23:57:13.869647 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:14 crc kubenswrapper[4764]: I1203 23:57:14.332223 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ph7dn"]
Dec 03 23:57:14 crc kubenswrapper[4764]: I1203 23:57:14.345256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ph7dn" event={"ID":"c632a187-d000-4581-bdbb-9f0f47e448cc","Type":"ContainerStarted","Data":"01e31598697690bcd74fc90736b61c6066b7354d3abf92133f8c16780bf076aa"}
Dec 03 23:57:16 crc kubenswrapper[4764]: I1203 23:57:16.364740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ph7dn" event={"ID":"c632a187-d000-4581-bdbb-9f0f47e448cc","Type":"ContainerStarted","Data":"8ce53f7c149aabf7cbec42be20d9a8e89ca24e2ec2ca274b6a447d69fcc13afd"}
Dec 03 23:57:16 crc kubenswrapper[4764]: I1203 23:57:16.391792 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ph7dn" podStartSLOduration=2.387451543 podStartE2EDuration="3.391701193s" podCreationTimestamp="2025-12-03 23:57:13 +0000 UTC" firstStartedPulling="2025-12-03 23:57:14.339995911 +0000 UTC m=+970.101320332" lastFinishedPulling="2025-12-03 23:57:15.344245561 +0000 UTC m=+971.105569982" observedRunningTime="2025-12-03 23:57:16.386148867 +0000 UTC m=+972.147473328" watchObservedRunningTime="2025-12-03 23:57:16.391701193 +0000 UTC m=+972.153025645"
Dec 03 23:57:23 crc kubenswrapper[4764]: I1203 23:57:23.870349 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:23 crc kubenswrapper[4764]: I1203 23:57:23.871659 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:23 crc kubenswrapper[4764]: I1203 23:57:23.921507 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:24 crc kubenswrapper[4764]: I1203 23:57:24.469830 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ph7dn"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.350824 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"]
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.352059 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.356674 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-x4pmb"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.371179 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"]
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.376610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw8rl\" (UniqueName: \"kubernetes.io/projected/db4fd273-0b92-483d-b8c6-5f558f260a3e-kube-api-access-qw8rl\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.376693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.376883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.478345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.478478 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw8rl\" (UniqueName: \"kubernetes.io/projected/db4fd273-0b92-483d-b8c6-5f558f260a3e-kube-api-access-qw8rl\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.478518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.479549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.479561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.514824 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw8rl\" (UniqueName: \"kubernetes.io/projected/db4fd273-0b92-483d-b8c6-5f558f260a3e-kube-api-access-qw8rl\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:31 crc kubenswrapper[4764]: I1203 23:57:31.682676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:32 crc kubenswrapper[4764]: I1203 23:57:32.016215 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"]
Dec 03 23:57:32 crc kubenswrapper[4764]: W1203 23:57:32.020307 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4fd273_0b92_483d_b8c6_5f558f260a3e.slice/crio-5a4912cd068ab4bc95b496fed686ee0af332cb3c95fbad82b57ba60dc6c443dc WatchSource:0}: Error finding container 5a4912cd068ab4bc95b496fed686ee0af332cb3c95fbad82b57ba60dc6c443dc: Status 404 returned error can't find the container with id 5a4912cd068ab4bc95b496fed686ee0af332cb3c95fbad82b57ba60dc6c443dc
Dec 03 23:57:32 crc kubenswrapper[4764]: I1203 23:57:32.493140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp" event={"ID":"db4fd273-0b92-483d-b8c6-5f558f260a3e","Type":"ContainerStarted","Data":"5a4912cd068ab4bc95b496fed686ee0af332cb3c95fbad82b57ba60dc6c443dc"}
Dec 03 23:57:33 crc kubenswrapper[4764]: I1203 23:57:33.501000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp" event={"ID":"db4fd273-0b92-483d-b8c6-5f558f260a3e","Type":"ContainerStarted","Data":"6d4817e0ba9885abdd61563946351eba76e171eb88072d38ab5506ad2f8210d2"}
Dec 03 23:57:34 crc kubenswrapper[4764]: I1203 23:57:34.509450 4764 generic.go:334] "Generic (PLEG): container finished" podID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerID="6d4817e0ba9885abdd61563946351eba76e171eb88072d38ab5506ad2f8210d2" exitCode=0
Dec 03 23:57:34 crc kubenswrapper[4764]: I1203 23:57:34.509580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp" event={"ID":"db4fd273-0b92-483d-b8c6-5f558f260a3e","Type":"ContainerDied","Data":"6d4817e0ba9885abdd61563946351eba76e171eb88072d38ab5506ad2f8210d2"}
Dec 03 23:57:37 crc kubenswrapper[4764]: I1203 23:57:37.537275 4764 generic.go:334] "Generic (PLEG): container finished" podID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerID="0f9c0ecd5045c722e0befa78f6d00b0d2de3a3d2ee7847895dbcb8864faf27b2" exitCode=0
Dec 03 23:57:37 crc kubenswrapper[4764]: I1203 23:57:37.537491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp" event={"ID":"db4fd273-0b92-483d-b8c6-5f558f260a3e","Type":"ContainerDied","Data":"0f9c0ecd5045c722e0befa78f6d00b0d2de3a3d2ee7847895dbcb8864faf27b2"}
Dec 03 23:57:38 crc kubenswrapper[4764]: I1203 23:57:38.547956 4764 generic.go:334] "Generic (PLEG): container finished" podID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerID="0ce827b9f409f02646a8100b2cd415ac46dd9ead4590b2d5adf8e849501a7e15" exitCode=0
Dec 03 23:57:38 crc kubenswrapper[4764]: I1203 23:57:38.560122 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp" event={"ID":"db4fd273-0b92-483d-b8c6-5f558f260a3e","Type":"ContainerDied","Data":"0ce827b9f409f02646a8100b2cd415ac46dd9ead4590b2d5adf8e849501a7e15"}
Dec 03 23:57:39 crc kubenswrapper[4764]: I1203 23:57:39.874952 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.010217 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-util\") pod \"db4fd273-0b92-483d-b8c6-5f558f260a3e\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") "
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.010344 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-bundle\") pod \"db4fd273-0b92-483d-b8c6-5f558f260a3e\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") "
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.010487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw8rl\" (UniqueName: \"kubernetes.io/projected/db4fd273-0b92-483d-b8c6-5f558f260a3e-kube-api-access-qw8rl\") pod \"db4fd273-0b92-483d-b8c6-5f558f260a3e\" (UID: \"db4fd273-0b92-483d-b8c6-5f558f260a3e\") "
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.011690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-bundle" (OuterVolumeSpecName: "bundle") pod "db4fd273-0b92-483d-b8c6-5f558f260a3e" (UID: "db4fd273-0b92-483d-b8c6-5f558f260a3e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.016673 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4fd273-0b92-483d-b8c6-5f558f260a3e-kube-api-access-qw8rl" (OuterVolumeSpecName: "kube-api-access-qw8rl") pod "db4fd273-0b92-483d-b8c6-5f558f260a3e" (UID: "db4fd273-0b92-483d-b8c6-5f558f260a3e"). InnerVolumeSpecName "kube-api-access-qw8rl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.032270 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-util" (OuterVolumeSpecName: "util") pod "db4fd273-0b92-483d-b8c6-5f558f260a3e" (UID: "db4fd273-0b92-483d-b8c6-5f558f260a3e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.112503 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-util\") on node \"crc\" DevicePath \"\""
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.112551 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/db4fd273-0b92-483d-b8c6-5f558f260a3e-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.112565 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw8rl\" (UniqueName: \"kubernetes.io/projected/db4fd273-0b92-483d-b8c6-5f558f260a3e-kube-api-access-qw8rl\") on node \"crc\" DevicePath \"\""
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.594162 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp" event={"ID":"db4fd273-0b92-483d-b8c6-5f558f260a3e","Type":"ContainerDied","Data":"5a4912cd068ab4bc95b496fed686ee0af332cb3c95fbad82b57ba60dc6c443dc"}
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.594224 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a4912cd068ab4bc95b496fed686ee0af332cb3c95fbad82b57ba60dc6c443dc"
Dec 03 23:57:40 crc kubenswrapper[4764]: I1203 23:57:40.594262 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.633293 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"]
Dec 03 23:57:43 crc kubenswrapper[4764]: E1203 23:57:43.633884 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="util"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.633899 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="util"
Dec 03 23:57:43 crc kubenswrapper[4764]: E1203 23:57:43.633919 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="pull"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.633929 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="pull"
Dec 03 23:57:43 crc kubenswrapper[4764]: E1203 23:57:43.633941 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="extract"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.633949 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="extract"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.634082 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4fd273-0b92-483d-b8c6-5f558f260a3e" containerName="extract"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.634537 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.636932 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ctdd9"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.671851 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"]
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.763161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92hb\" (UniqueName: \"kubernetes.io/projected/a69b6c88-819a-444b-8eff-4b629d4d4c87-kube-api-access-f92hb\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-b5s5w\" (UID: \"a69b6c88-819a-444b-8eff-4b629d4d4c87\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.864998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f92hb\" (UniqueName: \"kubernetes.io/projected/a69b6c88-819a-444b-8eff-4b629d4d4c87-kube-api-access-f92hb\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-b5s5w\" (UID: \"a69b6c88-819a-444b-8eff-4b629d4d4c87\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.887745 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92hb\" (UniqueName: \"kubernetes.io/projected/a69b6c88-819a-444b-8eff-4b629d4d4c87-kube-api-access-f92hb\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-b5s5w\" (UID: \"a69b6c88-819a-444b-8eff-4b629d4d4c87\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:57:43 crc kubenswrapper[4764]: I1203 23:57:43.952916 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:57:44 crc kubenswrapper[4764]: I1203 23:57:44.510150 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"]
Dec 03 23:57:44 crc kubenswrapper[4764]: W1203 23:57:44.522335 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda69b6c88_819a_444b_8eff_4b629d4d4c87.slice/crio-bd0bf4401b82e082cca100b55d9dd6732c1ad4208d7f9ad33319afc7f1b7e6af WatchSource:0}: Error finding container bd0bf4401b82e082cca100b55d9dd6732c1ad4208d7f9ad33319afc7f1b7e6af: Status 404 returned error can't find the container with id bd0bf4401b82e082cca100b55d9dd6732c1ad4208d7f9ad33319afc7f1b7e6af
Dec 03 23:57:44 crc kubenswrapper[4764]: I1203 23:57:44.621366 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w" event={"ID":"a69b6c88-819a-444b-8eff-4b629d4d4c87","Type":"ContainerStarted","Data":"bd0bf4401b82e082cca100b55d9dd6732c1ad4208d7f9ad33319afc7f1b7e6af"}
Dec 03 23:57:49 crc kubenswrapper[4764]: I1203 23:57:49.655300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w" event={"ID":"a69b6c88-819a-444b-8eff-4b629d4d4c87","Type":"ContainerStarted","Data":"c7f5cbd07d595f9ffebcc4cc7efae1ff7ea307196bfa229c74af06ff7e1f06e8"}
Dec 03 23:57:49 crc kubenswrapper[4764]: I1203 23:57:49.655671 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:57:49 crc kubenswrapper[4764]: I1203 23:57:49.706094 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w" podStartSLOduration=2.225098601 podStartE2EDuration="6.706060103s" podCreationTimestamp="2025-12-03 23:57:43 +0000 UTC" firstStartedPulling="2025-12-03 23:57:44.543813326 +0000 UTC m=+1000.305137737" lastFinishedPulling="2025-12-03 23:57:49.024774828 +0000 UTC m=+1004.786099239" observedRunningTime="2025-12-03 23:57:49.700683611 +0000 UTC m=+1005.462008072" watchObservedRunningTime="2025-12-03 23:57:49.706060103 +0000 UTC m=+1005.467384554"
Dec 03 23:57:50 crc kubenswrapper[4764]: I1203 23:57:50.868868 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:57:50 crc kubenswrapper[4764]: I1203 23:57:50.869343 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:58:03 crc kubenswrapper[4764]: I1203 23:58:03.957227 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-b5s5w"
Dec 03 23:58:20 crc kubenswrapper[4764]: I1203 23:58:20.868393 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 23:58:20 crc kubenswrapper[4764]: I1203 23:58:20.868885 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.194271 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t"]
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.195481 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.199822 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr"]
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.199947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7q7cd"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.200934 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.207770 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr"]
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.208888 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-gm9jb"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.220488 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfw8s\" (UniqueName: \"kubernetes.io/projected/94c55b91-cbc9-47c5-8abc-5140aeebf8d0-kube-api-access-xfw8s\") pod \"barbican-operator-controller-manager-7d9dfd778-7kt8t\" (UID: \"94c55b91-cbc9-47c5-8abc-5140aeebf8d0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.220607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwjf5\" (UniqueName: \"kubernetes.io/projected/ad7f7e9e-482b-415c-bf8d-02c9efbe387d-kube-api-access-jwjf5\") pod \"cinder-operator-controller-manager-859b6ccc6-bvqgr\" (UID: \"ad7f7e9e-482b-415c-bf8d-02c9efbe387d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr"
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.220803 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5"]
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.222022 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.223657 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2zqz5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.228747 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.235298 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.236648 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.238946 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-ntrs5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.242624 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.255610 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.256618 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.258422 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wjph8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.264638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.279449 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.297469 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.298621 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.303487 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.304467 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.308974 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.313933 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.314124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-tm7nq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.314225 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-x8jnv" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.317351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.322952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfw8s\" (UniqueName: \"kubernetes.io/projected/94c55b91-cbc9-47c5-8abc-5140aeebf8d0-kube-api-access-xfw8s\") pod \"barbican-operator-controller-manager-7d9dfd778-7kt8t\" (UID: \"94c55b91-cbc9-47c5-8abc-5140aeebf8d0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.323047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwjf5\" (UniqueName: \"kubernetes.io/projected/ad7f7e9e-482b-415c-bf8d-02c9efbe387d-kube-api-access-jwjf5\") pod \"cinder-operator-controller-manager-859b6ccc6-bvqgr\" (UID: \"ad7f7e9e-482b-415c-bf8d-02c9efbe387d\") " 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.356685 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.366004 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.369079 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfw8s\" (UniqueName: \"kubernetes.io/projected/94c55b91-cbc9-47c5-8abc-5140aeebf8d0-kube-api-access-xfw8s\") pod \"barbican-operator-controller-manager-7d9dfd778-7kt8t\" (UID: \"94c55b91-cbc9-47c5-8abc-5140aeebf8d0\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.369612 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gr9nq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.375184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwjf5\" (UniqueName: \"kubernetes.io/projected/ad7f7e9e-482b-415c-bf8d-02c9efbe387d-kube-api-access-jwjf5\") pod \"cinder-operator-controller-manager-859b6ccc6-bvqgr\" (UID: \"ad7f7e9e-482b-415c-bf8d-02c9efbe387d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.397551 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.419776 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7"] Dec 03 
23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.421154 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.424912 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-pkrjs" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.425627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94bs\" (UniqueName: \"kubernetes.io/projected/06f61b54-9d32-467a-be0b-07f8fdf867aa-kube-api-access-c94bs\") pod \"glance-operator-controller-manager-77987cd8cd-qxhzq\" (UID: \"06f61b54-9d32-467a-be0b-07f8fdf867aa\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.425656 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6gm9\" (UniqueName: \"kubernetes.io/projected/f2ff43e5-0f12-4008-a286-e6872cf78923-kube-api-access-t6gm9\") pod \"heat-operator-controller-manager-5f64f6f8bb-k66k8\" (UID: \"f2ff43e5-0f12-4008-a286-e6872cf78923\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.425690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g95wd\" (UniqueName: \"kubernetes.io/projected/4882c949-7a46-408b-a5ee-fc0fdcc6b291-kube-api-access-g95wd\") pod \"designate-operator-controller-manager-78b4bc895b-q2lz5\" (UID: \"4882c949-7a46-408b-a5ee-fc0fdcc6b291\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.425710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-s9kh6\" (UniqueName: \"kubernetes.io/projected/37b7d1f8-a668-44e9-af8d-0be7555bf2f6-kube-api-access-s9kh6\") pod \"horizon-operator-controller-manager-68c6d99b8f-nt8s2\" (UID: \"37b7d1f8-a668-44e9-af8d-0be7555bf2f6\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.427599 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.427647 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69gt\" (UniqueName: \"kubernetes.io/projected/3178ac5b-9384-4653-bbc2-713a718eac88-kube-api-access-n69gt\") pod \"ironic-operator-controller-manager-6c548fd776-dt6f8\" (UID: \"3178ac5b-9384-4653-bbc2-713a718eac88\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.427685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99q8x\" (UniqueName: \"kubernetes.io/projected/ebef9632-34f5-48ce-9a64-c76cf619498e-kube-api-access-99q8x\") pod \"keystone-operator-controller-manager-7765d96ddf-gp7h7\" (UID: \"ebef9632-34f5-48ce-9a64-c76cf619498e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.427707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbs5\" (UniqueName: 
\"kubernetes.io/projected/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-kube-api-access-tpbs5\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.436789 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.446072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.448393 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nth92" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.485775 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.499307 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.519333 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.522632 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.524833 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-shspl" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528724 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g95wd\" (UniqueName: \"kubernetes.io/projected/4882c949-7a46-408b-a5ee-fc0fdcc6b291-kube-api-access-g95wd\") pod \"designate-operator-controller-manager-78b4bc895b-q2lz5\" (UID: \"4882c949-7a46-408b-a5ee-fc0fdcc6b291\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9kh6\" (UniqueName: \"kubernetes.io/projected/37b7d1f8-a668-44e9-af8d-0be7555bf2f6-kube-api-access-s9kh6\") pod \"horizon-operator-controller-manager-68c6d99b8f-nt8s2\" (UID: \"37b7d1f8-a668-44e9-af8d-0be7555bf2f6\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528798 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528837 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n69gt\" (UniqueName: \"kubernetes.io/projected/3178ac5b-9384-4653-bbc2-713a718eac88-kube-api-access-n69gt\") pod \"ironic-operator-controller-manager-6c548fd776-dt6f8\" (UID: 
\"3178ac5b-9384-4653-bbc2-713a718eac88\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99q8x\" (UniqueName: \"kubernetes.io/projected/ebef9632-34f5-48ce-9a64-c76cf619498e-kube-api-access-99q8x\") pod \"keystone-operator-controller-manager-7765d96ddf-gp7h7\" (UID: \"ebef9632-34f5-48ce-9a64-c76cf619498e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpbs5\" (UniqueName: \"kubernetes.io/projected/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-kube-api-access-tpbs5\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94bs\" (UniqueName: \"kubernetes.io/projected/06f61b54-9d32-467a-be0b-07f8fdf867aa-kube-api-access-c94bs\") pod \"glance-operator-controller-manager-77987cd8cd-qxhzq\" (UID: \"06f61b54-9d32-467a-be0b-07f8fdf867aa\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.528960 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6gm9\" (UniqueName: \"kubernetes.io/projected/f2ff43e5-0f12-4008-a286-e6872cf78923-kube-api-access-t6gm9\") pod \"heat-operator-controller-manager-5f64f6f8bb-k66k8\" (UID: \"f2ff43e5-0f12-4008-a286-e6872cf78923\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:23 crc kubenswrapper[4764]: 
E1203 23:58:23.529680 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:23 crc kubenswrapper[4764]: E1203 23:58:23.529763 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert podName:4e494874-aa22-4cbb-aef2-a20b3ad2eea3 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:24.029706612 +0000 UTC m=+1039.791031023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert") pod "infra-operator-controller-manager-57548d458d-nt8f5" (UID: "4e494874-aa22-4cbb-aef2-a20b3ad2eea3") : secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.536359 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.552741 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.602818 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6gm9\" (UniqueName: \"kubernetes.io/projected/f2ff43e5-0f12-4008-a286-e6872cf78923-kube-api-access-t6gm9\") pod \"heat-operator-controller-manager-5f64f6f8bb-k66k8\" (UID: \"f2ff43e5-0f12-4008-a286-e6872cf78923\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.604267 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.606822 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.618140 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9kh6\" (UniqueName: \"kubernetes.io/projected/37b7d1f8-a668-44e9-af8d-0be7555bf2f6-kube-api-access-s9kh6\") pod \"horizon-operator-controller-manager-68c6d99b8f-nt8s2\" (UID: \"37b7d1f8-a668-44e9-af8d-0be7555bf2f6\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.618946 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99q8x\" (UniqueName: \"kubernetes.io/projected/ebef9632-34f5-48ce-9a64-c76cf619498e-kube-api-access-99q8x\") pod \"keystone-operator-controller-manager-7765d96ddf-gp7h7\" (UID: \"ebef9632-34f5-48ce-9a64-c76cf619498e\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.622801 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.623982 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.628082 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mcvwp" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.629516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpbs5\" (UniqueName: \"kubernetes.io/projected/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-kube-api-access-tpbs5\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.629836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xql\" (UniqueName: \"kubernetes.io/projected/984c6845-3698-46c3-9d88-416635322b98-kube-api-access-f9xql\") pod \"mariadb-operator-controller-manager-56bbcc9d85-xrd58\" (UID: \"984c6845-3698-46c3-9d88-416635322b98\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.629936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6wn\" (UniqueName: \"kubernetes.io/projected/c4162b99-a9c7-471c-a408-058ffb74fe69-kube-api-access-kk6wn\") pod \"manila-operator-controller-manager-7c79b5df47-qwfd7\" (UID: \"c4162b99-a9c7-471c-a408-058ffb74fe69\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.634900 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g95wd\" (UniqueName: \"kubernetes.io/projected/4882c949-7a46-408b-a5ee-fc0fdcc6b291-kube-api-access-g95wd\") pod 
\"designate-operator-controller-manager-78b4bc895b-q2lz5\" (UID: \"4882c949-7a46-408b-a5ee-fc0fdcc6b291\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.636695 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.638801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94bs\" (UniqueName: \"kubernetes.io/projected/06f61b54-9d32-467a-be0b-07f8fdf867aa-kube-api-access-c94bs\") pod \"glance-operator-controller-manager-77987cd8cd-qxhzq\" (UID: \"06f61b54-9d32-467a-be0b-07f8fdf867aa\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.662286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69gt\" (UniqueName: \"kubernetes.io/projected/3178ac5b-9384-4653-bbc2-713a718eac88-kube-api-access-n69gt\") pod \"ironic-operator-controller-manager-6c548fd776-dt6f8\" (UID: \"3178ac5b-9384-4653-bbc2-713a718eac88\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.662350 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-975dt"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.663381 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.671646 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-tbxq2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.672674 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-zj7br"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.673975 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.682554 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-k5wnd" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.683142 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-975dt"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.705765 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-zj7br"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.708781 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.719356 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.720641 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.723092 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-7rkd2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.729652 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.730926 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.731911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xql\" (UniqueName: \"kubernetes.io/projected/984c6845-3698-46c3-9d88-416635322b98-kube-api-access-f9xql\") pod \"mariadb-operator-controller-manager-56bbcc9d85-xrd58\" (UID: \"984c6845-3698-46c3-9d88-416635322b98\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.731983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brg8\" (UniqueName: \"kubernetes.io/projected/64d40f74-d89f-4e84-bb91-ff8cdcfdc747-kube-api-access-7brg8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vglgf\" (UID: \"64d40f74-d89f-4e84-bb91-ff8cdcfdc747\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.732023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6wn\" (UniqueName: \"kubernetes.io/projected/c4162b99-a9c7-471c-a408-058ffb74fe69-kube-api-access-kk6wn\") pod 
\"manila-operator-controller-manager-7c79b5df47-qwfd7\" (UID: \"c4162b99-a9c7-471c-a408-058ffb74fe69\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.744462 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-9cjtv" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.744615 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.747006 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.750087 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.766346 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-k5sms"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.767352 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.771084 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6wn\" (UniqueName: \"kubernetes.io/projected/c4162b99-a9c7-471c-a408-058ffb74fe69-kube-api-access-kk6wn\") pod \"manila-operator-controller-manager-7c79b5df47-qwfd7\" (UID: \"c4162b99-a9c7-471c-a408-058ffb74fe69\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.773891 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.777241 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-sn5ck" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.797319 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xql\" (UniqueName: \"kubernetes.io/projected/984c6845-3698-46c3-9d88-416635322b98-kube-api-access-f9xql\") pod \"mariadb-operator-controller-manager-56bbcc9d85-xrd58\" (UID: \"984c6845-3698-46c3-9d88-416635322b98\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.810005 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-k5sms"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.844260 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.845360 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.846332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.846378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdr5x\" (UniqueName: \"kubernetes.io/projected/f18b5092-5d70-482a-af1f-be661a68701e-kube-api-access-bdr5x\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.846411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7brg8\" (UniqueName: \"kubernetes.io/projected/64d40f74-d89f-4e84-bb91-ff8cdcfdc747-kube-api-access-7brg8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vglgf\" (UID: \"64d40f74-d89f-4e84-bb91-ff8cdcfdc747\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.846458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94vdq\" (UniqueName: \"kubernetes.io/projected/2ccf4351-3ae2-432d-ae11-1a07dab689ae-kube-api-access-94vdq\") pod \"ovn-operator-controller-manager-b6456fdb6-2hmdx\" (UID: \"2ccf4351-3ae2-432d-ae11-1a07dab689ae\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" 
Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.846491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf2b6\" (UniqueName: \"kubernetes.io/projected/e69ad255-4b3d-4c49-ad8a-59850f846c00-kube-api-access-tf2b6\") pod \"octavia-operator-controller-manager-998648c74-zj7br\" (UID: \"e69ad255-4b3d-4c49-ad8a-59850f846c00\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.846521 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lzg9\" (UniqueName: \"kubernetes.io/projected/2ec14efa-ac80-45f6-bcd2-20b404087776-kube-api-access-2lzg9\") pod \"nova-operator-controller-manager-697bc559fc-975dt\" (UID: \"2ec14efa-ac80-45f6-bcd2-20b404087776\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.850959 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-lrtkr" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.860117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.874791 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.875113 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.890645 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.892168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brg8\" (UniqueName: \"kubernetes.io/projected/64d40f74-d89f-4e84-bb91-ff8cdcfdc747-kube-api-access-7brg8\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vglgf\" (UID: \"64d40f74-d89f-4e84-bb91-ff8cdcfdc747\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.901380 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.904831 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-rjrmg" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.914022 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.925349 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952293 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94vdq\" (UniqueName: \"kubernetes.io/projected/2ccf4351-3ae2-432d-ae11-1a07dab689ae-kube-api-access-94vdq\") pod \"ovn-operator-controller-manager-b6456fdb6-2hmdx\" (UID: \"2ccf4351-3ae2-432d-ae11-1a07dab689ae\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tf2b6\" (UniqueName: \"kubernetes.io/projected/e69ad255-4b3d-4c49-ad8a-59850f846c00-kube-api-access-tf2b6\") pod \"octavia-operator-controller-manager-998648c74-zj7br\" (UID: \"e69ad255-4b3d-4c49-ad8a-59850f846c00\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952408 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6bsx\" (UniqueName: \"kubernetes.io/projected/ee4f58e7-bb44-483d-9ab4-c1e447c5e68c-kube-api-access-r6bsx\") pod \"placement-operator-controller-manager-78f8948974-k5sms\" (UID: \"ee4f58e7-bb44-483d-9ab4-c1e447c5e68c\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952437 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lzg9\" (UniqueName: \"kubernetes.io/projected/2ec14efa-ac80-45f6-bcd2-20b404087776-kube-api-access-2lzg9\") pod 
\"nova-operator-controller-manager-697bc559fc-975dt\" (UID: \"2ec14efa-ac80-45f6-bcd2-20b404087776\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952460 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdzl\" (UniqueName: \"kubernetes.io/projected/0f640363-0b23-4625-bf4d-2829b924640d-kube-api-access-hcdzl\") pod \"swift-operator-controller-manager-5f8c65bbfc-vzgmn\" (UID: \"0f640363-0b23-4625-bf4d-2829b924640d\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952489 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.952531 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdr5x\" (UniqueName: \"kubernetes.io/projected/f18b5092-5d70-482a-af1f-be661a68701e-kube-api-access-bdr5x\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.953319 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:23 crc kubenswrapper[4764]: E1203 23:58:23.953947 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:23 crc kubenswrapper[4764]: E1203 23:58:23.953993 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert podName:f18b5092-5d70-482a-af1f-be661a68701e nodeName:}" failed. No retries permitted until 2025-12-03 23:58:24.453975578 +0000 UTC m=+1040.215299989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" (UID: "f18b5092-5d70-482a-af1f-be661a68701e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.955587 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.965687 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.971702 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-w2wxv" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.975971 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc"] Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.977868 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.990183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lzg9\" (UniqueName: \"kubernetes.io/projected/2ec14efa-ac80-45f6-bcd2-20b404087776-kube-api-access-2lzg9\") pod \"nova-operator-controller-manager-697bc559fc-975dt\" (UID: \"2ec14efa-ac80-45f6-bcd2-20b404087776\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.993686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf2b6\" (UniqueName: \"kubernetes.io/projected/e69ad255-4b3d-4c49-ad8a-59850f846c00-kube-api-access-tf2b6\") pod \"octavia-operator-controller-manager-998648c74-zj7br\" (UID: \"e69ad255-4b3d-4c49-ad8a-59850f846c00\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.996330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vdq\" (UniqueName: \"kubernetes.io/projected/2ccf4351-3ae2-432d-ae11-1a07dab689ae-kube-api-access-94vdq\") pod \"ovn-operator-controller-manager-b6456fdb6-2hmdx\" (UID: \"2ccf4351-3ae2-432d-ae11-1a07dab689ae\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" Dec 03 23:58:23 crc kubenswrapper[4764]: I1203 23:58:23.996734 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdr5x\" (UniqueName: \"kubernetes.io/projected/f18b5092-5d70-482a-af1f-be661a68701e-kube-api-access-bdr5x\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.008816 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.011221 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.014302 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.017044 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.024736 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.026759 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rzqjx" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.055008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9p8x\" (UniqueName: \"kubernetes.io/projected/d703896f-745b-4359-8d2a-2c4b7cf5d062-kube-api-access-h9p8x\") pod \"telemetry-operator-controller-manager-76cc84c6bb-fj6g2\" (UID: \"d703896f-745b-4359-8d2a-2c4b7cf5d062\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.055114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6bsx\" (UniqueName: \"kubernetes.io/projected/ee4f58e7-bb44-483d-9ab4-c1e447c5e68c-kube-api-access-r6bsx\") pod \"placement-operator-controller-manager-78f8948974-k5sms\" 
(UID: \"ee4f58e7-bb44-483d-9ab4-c1e447c5e68c\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.055140 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdzl\" (UniqueName: \"kubernetes.io/projected/0f640363-0b23-4625-bf4d-2829b924640d-kube-api-access-hcdzl\") pod \"swift-operator-controller-manager-5f8c65bbfc-vzgmn\" (UID: \"0f640363-0b23-4625-bf4d-2829b924640d\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.055206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.055704 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.055783 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert podName:4e494874-aa22-4cbb-aef2-a20b3ad2eea3 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:25.055765613 +0000 UTC m=+1040.817090024 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert") pod "infra-operator-controller-manager-57548d458d-nt8f5" (UID: "4e494874-aa22-4cbb-aef2-a20b3ad2eea3") : secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.071081 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.089331 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.093638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdzl\" (UniqueName: \"kubernetes.io/projected/0f640363-0b23-4625-bf4d-2829b924640d-kube-api-access-hcdzl\") pod \"swift-operator-controller-manager-5f8c65bbfc-vzgmn\" (UID: \"0f640363-0b23-4625-bf4d-2829b924640d\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.119426 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6bsx\" (UniqueName: \"kubernetes.io/projected/ee4f58e7-bb44-483d-9ab4-c1e447c5e68c-kube-api-access-r6bsx\") pod \"placement-operator-controller-manager-78f8948974-k5sms\" (UID: \"ee4f58e7-bb44-483d-9ab4-c1e447c5e68c\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.146340 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.152223 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.153194 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.156748 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-n6bjz" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.156922 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.157046 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.157758 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4v9s\" (UniqueName: \"kubernetes.io/projected/247a518a-17e9-482b-bab7-832b31fa99e1-kube-api-access-s4v9s\") pod \"test-operator-controller-manager-5854674fcc-gd8dc\" (UID: \"247a518a-17e9-482b-bab7-832b31fa99e1\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.157836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9p8x\" (UniqueName: \"kubernetes.io/projected/d703896f-745b-4359-8d2a-2c4b7cf5d062-kube-api-access-h9p8x\") pod \"telemetry-operator-controller-manager-76cc84c6bb-fj6g2\" (UID: \"d703896f-745b-4359-8d2a-2c4b7cf5d062\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.157908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvnkl\" (UniqueName: \"kubernetes.io/projected/ae29fd3c-f586-427c-a482-fdc2f609aa25-kube-api-access-xvnkl\") pod \"watcher-operator-controller-manager-769dc69bc-s2rm5\" (UID: \"ae29fd3c-f586-427c-a482-fdc2f609aa25\") " 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.164192 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.177807 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9p8x\" (UniqueName: \"kubernetes.io/projected/d703896f-745b-4359-8d2a-2c4b7cf5d062-kube-api-access-h9p8x\") pod \"telemetry-operator-controller-manager-76cc84c6bb-fj6g2\" (UID: \"d703896f-745b-4359-8d2a-2c4b7cf5d062\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.187953 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.192841 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.193804 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.201445 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bxh5f" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.215562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.258865 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.258926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvnkl\" (UniqueName: \"kubernetes.io/projected/ae29fd3c-f586-427c-a482-fdc2f609aa25-kube-api-access-xvnkl\") pod \"watcher-operator-controller-manager-769dc69bc-s2rm5\" (UID: \"ae29fd3c-f586-427c-a482-fdc2f609aa25\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.258948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sfql\" (UniqueName: \"kubernetes.io/projected/7c88c833-c710-44c7-9bfb-a684a7f39c39-kube-api-access-5sfql\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.258974 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4v9s\" (UniqueName: \"kubernetes.io/projected/247a518a-17e9-482b-bab7-832b31fa99e1-kube-api-access-s4v9s\") pod \"test-operator-controller-manager-5854674fcc-gd8dc\" (UID: \"247a518a-17e9-482b-bab7-832b31fa99e1\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.259041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.284201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4v9s\" (UniqueName: \"kubernetes.io/projected/247a518a-17e9-482b-bab7-832b31fa99e1-kube-api-access-s4v9s\") pod \"test-operator-controller-manager-5854674fcc-gd8dc\" (UID: \"247a518a-17e9-482b-bab7-832b31fa99e1\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.293401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvnkl\" (UniqueName: \"kubernetes.io/projected/ae29fd3c-f586-427c-a482-fdc2f609aa25-kube-api-access-xvnkl\") pod \"watcher-operator-controller-manager-769dc69bc-s2rm5\" (UID: \"ae29fd3c-f586-427c-a482-fdc2f609aa25\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.315234 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.361945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.362030 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsw49\" (UniqueName: \"kubernetes.io/projected/d4dea3da-3ffa-4da1-9a93-5f3233112a23-kube-api-access-bsw49\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p7ptg\" (UID: \"d4dea3da-3ffa-4da1-9a93-5f3233112a23\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.362065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.362119 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sfql\" (UniqueName: \"kubernetes.io/projected/7c88c833-c710-44c7-9bfb-a684a7f39c39-kube-api-access-5sfql\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.362528 
4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.362575 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:24.86255779 +0000 UTC m=+1040.623882211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "metrics-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.362771 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.362802 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:24.862793556 +0000 UTC m=+1040.624117967 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.384468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sfql\" (UniqueName: \"kubernetes.io/projected/7c88c833-c710-44c7-9bfb-a684a7f39c39-kube-api-access-5sfql\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.404596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.424501 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.453578 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.459155 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.468614 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.468706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsw49\" (UniqueName: \"kubernetes.io/projected/d4dea3da-3ffa-4da1-9a93-5f3233112a23-kube-api-access-bsw49\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p7ptg\" (UID: \"d4dea3da-3ffa-4da1-9a93-5f3233112a23\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.469212 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.469294 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert podName:f18b5092-5d70-482a-af1f-be661a68701e nodeName:}" failed. 
No retries permitted until 2025-12-03 23:58:25.469257105 +0000 UTC m=+1041.230581516 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" (UID: "f18b5092-5d70-482a-af1f-be661a68701e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.486510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsw49\" (UniqueName: \"kubernetes.io/projected/d4dea3da-3ffa-4da1-9a93-5f3233112a23-kube-api-access-bsw49\") pod \"rabbitmq-cluster-operator-manager-668c99d594-p7ptg\" (UID: \"d4dea3da-3ffa-4da1-9a93-5f3233112a23\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.566509 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.703836 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.866551 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.873284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.873429 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.873553 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.873618 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:25.873602323 +0000 UTC m=+1041.634926734 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "metrics-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.873704 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: E1203 23:58:24.873788 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:25.873770017 +0000 UTC m=+1041.635094428 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "webhook-server-cert" not found Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.910143 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" event={"ID":"64d40f74-d89f-4e84-bb91-ff8cdcfdc747","Type":"ContainerStarted","Data":"60bf27166169a88ae221c762239b0a8dc52f041ebe72a7246c519371f29cfec5"} Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.911396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" event={"ID":"94c55b91-cbc9-47c5-8abc-5140aeebf8d0","Type":"ContainerStarted","Data":"b064ca937bc2f6633ef15235304ab9eadf4a6865176748530302a421499cbfb1"} Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.912708 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" event={"ID":"ad7f7e9e-482b-415c-bf8d-02c9efbe387d","Type":"ContainerStarted","Data":"4f90fdd00ec3a42f2da878cfe678965a295a5c4dab0bf5654e51a5c32f37e34a"} Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.913594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" event={"ID":"f2ff43e5-0f12-4008-a286-e6872cf78923","Type":"ContainerStarted","Data":"c1af73834cb974c0449ca9edb36467201cc783a41a516bf75325166fcae0c941"} Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.960585 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7"] Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.966566 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58"] Dec 03 23:58:24 crc kubenswrapper[4764]: W1203 23:58:24.969338 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3178ac5b_9384_4653_bbc2_713a718eac88.slice/crio-7d92eeea5ef813a583b494bd75b032901adaae7f62f3f118bef876790d7460be WatchSource:0}: Error finding container 7d92eeea5ef813a583b494bd75b032901adaae7f62f3f118bef876790d7460be: Status 404 returned error can't find the container with id 7d92eeea5ef813a583b494bd75b032901adaae7f62f3f118bef876790d7460be Dec 03 23:58:24 crc kubenswrapper[4764]: I1203 23:58:24.975466 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.079877 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.080039 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.080084 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert podName:4e494874-aa22-4cbb-aef2-a20b3ad2eea3 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:27.080069283 +0000 UTC m=+1042.841393694 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert") pod "infra-operator-controller-manager-57548d458d-nt8f5" (UID: "4e494874-aa22-4cbb-aef2-a20b3ad2eea3") : secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.088474 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.106549 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.113076 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.117977 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-zj7br"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.125386 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-975dt"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.128835 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-k5sms"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.132228 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2"] Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.145870 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2lzg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-975dt_openstack-operators(2ec14efa-ac80-45f6-bcd2-20b404087776): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.153727 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2lzg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-975dt_openstack-operators(2ec14efa-ac80-45f6-bcd2-20b404087776): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.155032 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" podUID="2ec14efa-ac80-45f6-bcd2-20b404087776" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.287745 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bsw49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-p7ptg_openstack-operators(d4dea3da-3ffa-4da1-9a93-5f3233112a23): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.295139 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg"] Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.295238 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" podUID="d4dea3da-3ffa-4da1-9a93-5f3233112a23" Dec 03 23:58:25 crc kubenswrapper[4764]: W1203 23:58:25.311236 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ccf4351_3ae2_432d_ae11_1a07dab689ae.slice/crio-23a6c9c3b5b9314bcfb112a665cfc23acbc6a2347d0193c64de03c028fae237d WatchSource:0}: Error finding container 23a6c9c3b5b9314bcfb112a665cfc23acbc6a2347d0193c64de03c028fae237d: Status 404 returned error can't find the container with id 23a6c9c3b5b9314bcfb112a665cfc23acbc6a2347d0193c64de03c028fae237d Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.313672 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-94vdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2hmdx_openstack-operators(2ccf4351-3ae2-432d-ae11-1a07dab689ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.315836 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-94vdq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2hmdx_openstack-operators(2ccf4351-3ae2-432d-ae11-1a07dab689ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.316588 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvnkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-s2rm5_openstack-operators(ae29fd3c-f586-427c-a482-fdc2f609aa25): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.317177 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" podUID="2ccf4351-3ae2-432d-ae11-1a07dab689ae" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.318333 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcdzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-vzgmn_openstack-operators(0f640363-0b23-4625-bf4d-2829b924640d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.320650 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvnkl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-s2rm5_openstack-operators(ae29fd3c-f586-427c-a482-fdc2f609aa25): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.322924 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hcdzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-vzgmn_openstack-operators(0f640363-0b23-4625-bf4d-2829b924640d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.322984 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" podUID="ae29fd3c-f586-427c-a482-fdc2f609aa25" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.324027 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" podUID="0f640363-0b23-4625-bf4d-2829b924640d" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.324697 4764 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4v9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-gd8dc_openstack-operators(247a518a-17e9-482b-bab7-832b31fa99e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.326146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2"] Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.327254 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s4v9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-gd8dc_openstack-operators(247a518a-17e9-482b-bab7-832b31fa99e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.328455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" podUID="247a518a-17e9-482b-bab7-832b31fa99e1" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.333368 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.346633 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.354876 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.360601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc"] Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.488430 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.488637 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.488707 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert podName:f18b5092-5d70-482a-af1f-be661a68701e nodeName:}" failed. No retries permitted until 2025-12-03 23:58:27.488687776 +0000 UTC m=+1043.250012187 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" (UID: "f18b5092-5d70-482a-af1f-be661a68701e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.893101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.893152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.893295 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.893336 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:27.893323801 +0000 UTC m=+1043.654648212 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "webhook-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.893429 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.893448 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:27.893442294 +0000 UTC m=+1043.654766705 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "metrics-server-cert" not found Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.923242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" event={"ID":"4882c949-7a46-408b-a5ee-fc0fdcc6b291","Type":"ContainerStarted","Data":"cbb4dcd1816ef3ef9e2c09785c22b125aca1b043557a2952a41873d5b9d10d76"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.924932 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" event={"ID":"37b7d1f8-a668-44e9-af8d-0be7555bf2f6","Type":"ContainerStarted","Data":"aee1b048c0c4bac9be6c2a48c26749fc0d14a3d045b6b4f5008b5968a5e9c633"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.928114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" event={"ID":"984c6845-3698-46c3-9d88-416635322b98","Type":"ContainerStarted","Data":"bb0338dfda9457126a3824bdc8886bcc13a544362e442d2fd0d96d136640b9cc"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.930511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" event={"ID":"2ec14efa-ac80-45f6-bcd2-20b404087776","Type":"ContainerStarted","Data":"4b74df9206f0e27cb800030e960d7fb40646a44a1b252378ce49045e38bbffdb"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.931485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" event={"ID":"3178ac5b-9384-4653-bbc2-713a718eac88","Type":"ContainerStarted","Data":"7d92eeea5ef813a583b494bd75b032901adaae7f62f3f118bef876790d7460be"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.933468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" event={"ID":"d4dea3da-3ffa-4da1-9a93-5f3233112a23","Type":"ContainerStarted","Data":"346144e6295705944c251c2301c28279de8c0f5228c7aa38ddd0072df5bc0ee0"} Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.933582 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" podUID="2ec14efa-ac80-45f6-bcd2-20b404087776" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.934559 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" event={"ID":"06f61b54-9d32-467a-be0b-07f8fdf867aa","Type":"ContainerStarted","Data":"e8ca50dee7f3ab69834d64e58c398f9a6653b96e9fd95625271eebd5a277e597"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.935960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" event={"ID":"ae29fd3c-f586-427c-a482-fdc2f609aa25","Type":"ContainerStarted","Data":"08f45f3cb5208fdc81f8e109f5cb728f4c501193faa966b61c1d20c5f6657bf1"} Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.939091 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" podUID="d4dea3da-3ffa-4da1-9a93-5f3233112a23" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.939885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" event={"ID":"ebef9632-34f5-48ce-9a64-c76cf619498e","Type":"ContainerStarted","Data":"dd85dff25b0fc3931b3223d0c5e70cf4703a6920020edac17b9479a9393818d2"} Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.942564 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" podUID="ae29fd3c-f586-427c-a482-fdc2f609aa25" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.943500 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" event={"ID":"d703896f-745b-4359-8d2a-2c4b7cf5d062","Type":"ContainerStarted","Data":"9461ebf88bf6eb2a7558407f512b218df535cac7b5ffa33b6d66bb149c9fd43b"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.945828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" event={"ID":"247a518a-17e9-482b-bab7-832b31fa99e1","Type":"ContainerStarted","Data":"1c91d4e1a9947d03715daa5544673697b811595e7f3b21e6613f90e78820f010"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.948334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" event={"ID":"e69ad255-4b3d-4c49-ad8a-59850f846c00","Type":"ContainerStarted","Data":"68a19a711fc4903110dffc23e2a1ae55047ee0d7bc8ab532d3d5a872a6a25016"} Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.948698 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" podUID="247a518a-17e9-482b-bab7-832b31fa99e1" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.952134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" event={"ID":"ee4f58e7-bb44-483d-9ab4-c1e447c5e68c","Type":"ContainerStarted","Data":"cd85915c31058d4bab705039874894f60ca429fe74275d5def4088b3ba8305a4"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.954294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" event={"ID":"2ccf4351-3ae2-432d-ae11-1a07dab689ae","Type":"ContainerStarted","Data":"23a6c9c3b5b9314bcfb112a665cfc23acbc6a2347d0193c64de03c028fae237d"} Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.957533 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" event={"ID":"c4162b99-a9c7-471c-a408-058ffb74fe69","Type":"ContainerStarted","Data":"190ba591c997351aff3aab50542ae153dad757735573d46c4b77f38445fafd91"} Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.958054 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" podUID="2ccf4351-3ae2-432d-ae11-1a07dab689ae" Dec 03 23:58:25 crc kubenswrapper[4764]: I1203 23:58:25.959009 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" event={"ID":"0f640363-0b23-4625-bf4d-2829b924640d","Type":"ContainerStarted","Data":"625535fea096772a96590e6cae76e70db241cd30bcd41185329c13c535405d8f"} Dec 03 23:58:25 crc kubenswrapper[4764]: E1203 23:58:25.966300 4764 pod_workers.go:1301] 
"Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" podUID="0f640363-0b23-4625-bf4d-2829b924640d" Dec 03 23:58:26 crc kubenswrapper[4764]: E1203 23:58:26.967968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" podUID="d4dea3da-3ffa-4da1-9a93-5f3233112a23" Dec 03 23:58:26 crc kubenswrapper[4764]: E1203 23:58:26.968018 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" podUID="247a518a-17e9-482b-bab7-832b31fa99e1" Dec 03 23:58:26 crc kubenswrapper[4764]: E1203 23:58:26.968053 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" podUID="2ec14efa-ac80-45f6-bcd2-20b404087776" Dec 03 23:58:26 crc kubenswrapper[4764]: E1203 23:58:26.968065 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" podUID="ae29fd3c-f586-427c-a482-fdc2f609aa25" Dec 03 23:58:26 crc kubenswrapper[4764]: E1203 23:58:26.968535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" podUID="2ccf4351-3ae2-432d-ae11-1a07dab689ae" Dec 03 23:58:26 crc kubenswrapper[4764]: E1203 23:58:26.969217 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" podUID="0f640363-0b23-4625-bf4d-2829b924640d" Dec 03 23:58:27 crc kubenswrapper[4764]: I1203 23:58:27.115508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.116374 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.116436 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert podName:4e494874-aa22-4cbb-aef2-a20b3ad2eea3 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:31.116420221 +0000 UTC m=+1046.877744632 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert") pod "infra-operator-controller-manager-57548d458d-nt8f5" (UID: "4e494874-aa22-4cbb-aef2-a20b3ad2eea3") : secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: I1203 23:58:27.522697 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.522896 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.522988 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert podName:f18b5092-5d70-482a-af1f-be661a68701e nodeName:}" failed. No retries permitted until 2025-12-03 23:58:31.522963063 +0000 UTC m=+1047.284287494 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" (UID: "f18b5092-5d70-482a-af1f-be661a68701e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: I1203 23:58:27.927444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:27 crc kubenswrapper[4764]: I1203 23:58:27.927506 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.927641 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.927657 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.927693 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:31.927679001 +0000 UTC m=+1047.689003412 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "webhook-server-cert" not found Dec 03 23:58:27 crc kubenswrapper[4764]: E1203 23:58:27.927768 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:31.927746752 +0000 UTC m=+1047.689071183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "metrics-server-cert" not found Dec 03 23:58:31 crc kubenswrapper[4764]: I1203 23:58:31.188341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:31 crc kubenswrapper[4764]: E1203 23:58:31.188473 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:31 crc kubenswrapper[4764]: E1203 23:58:31.188856 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert podName:4e494874-aa22-4cbb-aef2-a20b3ad2eea3 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:39.188833713 +0000 UTC m=+1054.950158134 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert") pod "infra-operator-controller-manager-57548d458d-nt8f5" (UID: "4e494874-aa22-4cbb-aef2-a20b3ad2eea3") : secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:31 crc kubenswrapper[4764]: I1203 23:58:31.595025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:31 crc kubenswrapper[4764]: E1203 23:58:31.595162 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:31 crc kubenswrapper[4764]: E1203 23:58:31.595212 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert podName:f18b5092-5d70-482a-af1f-be661a68701e nodeName:}" failed. No retries permitted until 2025-12-03 23:58:39.59519696 +0000 UTC m=+1055.356521371 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" (UID: "f18b5092-5d70-482a-af1f-be661a68701e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:32 crc kubenswrapper[4764]: I1203 23:58:32.002508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:32 crc kubenswrapper[4764]: I1203 23:58:32.002627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:32 crc kubenswrapper[4764]: E1203 23:58:32.002986 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 23:58:32 crc kubenswrapper[4764]: E1203 23:58:32.003083 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:40.003056795 +0000 UTC m=+1055.764381246 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "webhook-server-cert" not found Dec 03 23:58:32 crc kubenswrapper[4764]: E1203 23:58:32.003165 4764 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 23:58:32 crc kubenswrapper[4764]: E1203 23:58:32.003446 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:40.003369112 +0000 UTC m=+1055.764693553 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "metrics-server-cert" not found Dec 03 23:58:38 crc kubenswrapper[4764]: E1203 23:58:38.357509 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s9kh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-nt8s2_openstack-operators(37b7d1f8-a668-44e9-af8d-0be7555bf2f6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:38 crc kubenswrapper[4764]: E1203 23:58:38.357598 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g95wd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-q2lz5_openstack-operators(4882c949-7a46-408b-a5ee-fc0fdcc6b291): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:38 crc kubenswrapper[4764]: E1203 23:58:38.359900 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" podUID="4882c949-7a46-408b-a5ee-fc0fdcc6b291" Dec 03 23:58:38 crc kubenswrapper[4764]: E1203 23:58:38.359946 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" podUID="37b7d1f8-a668-44e9-af8d-0be7555bf2f6" Dec 03 23:58:38 crc kubenswrapper[4764]: E1203 23:58:38.382180 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xfw8s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-7kt8t_openstack-operators(94c55b91-cbc9-47c5-8abc-5140aeebf8d0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 23:58:38 crc kubenswrapper[4764]: E1203 23:58:38.384338 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" podUID="94c55b91-cbc9-47c5-8abc-5140aeebf8d0" Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.108725 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" event={"ID":"94c55b91-cbc9-47c5-8abc-5140aeebf8d0","Type":"ContainerStarted","Data":"ee3d16b06671e6d31f4690c61a989976fbf554247359cab0c1f13795f69f2dc7"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.108967 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.113442 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" podUID="94c55b91-cbc9-47c5-8abc-5140aeebf8d0" Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.135595 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" event={"ID":"e69ad255-4b3d-4c49-ad8a-59850f846c00","Type":"ContainerStarted","Data":"adb8ae30021ad676cc7a0e7a2fc4ac19b87926d2384b5c848b2e398a8a37c85d"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.138560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" event={"ID":"64d40f74-d89f-4e84-bb91-ff8cdcfdc747","Type":"ContainerStarted","Data":"f7f08172a21c9f999d23ed3102bc6eaccf31ded313b013311134a2437ae81186"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.139423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" event={"ID":"ee4f58e7-bb44-483d-9ab4-c1e447c5e68c","Type":"ContainerStarted","Data":"8f306e791ffcbe0ecac938e5435b3764b2b464299ad85497007c848e0cdb7f4d"} Dec 03 23:58:39 crc 
kubenswrapper[4764]: I1203 23:58:39.140228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" event={"ID":"3178ac5b-9384-4653-bbc2-713a718eac88","Type":"ContainerStarted","Data":"a659ab361f6ec8d61667cc5801b1af9f0f439ae4ec0ec3129fad4d5ceadefd66"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.141045 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" event={"ID":"06f61b54-9d32-467a-be0b-07f8fdf867aa","Type":"ContainerStarted","Data":"084ec3b5b0d0e6873eb7a22e21b8c7b0847852401a81aeef7be1900acad1b4e4"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.141982 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" event={"ID":"ad7f7e9e-482b-415c-bf8d-02c9efbe387d","Type":"ContainerStarted","Data":"90922f52e8e47e30375ecad2d60826fff73b81a66daaae604a355ab21a58c1ce"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.152100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" event={"ID":"984c6845-3698-46c3-9d88-416635322b98","Type":"ContainerStarted","Data":"f838b861fba6117f50c0c21f6447ee5710eb0ad7eb21195c2a3db1438fd6c6ad"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.153974 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" event={"ID":"d703896f-745b-4359-8d2a-2c4b7cf5d062","Type":"ContainerStarted","Data":"de2c5d111d9f2582124651a74497fe1d8f3cf99414c27633651bdec6d1dd5701"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.154767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" 
event={"ID":"ebef9632-34f5-48ce-9a64-c76cf619498e","Type":"ContainerStarted","Data":"7a74c862f09a3c54a2b72211e6cd5483287f6a17c7910603a1e76e0001a2099a"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.155598 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" event={"ID":"37b7d1f8-a668-44e9-af8d-0be7555bf2f6","Type":"ContainerStarted","Data":"3832e0bf174fdc3df59016974a709b42bea0980a472a5c461b342ac2c3d59173"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.155691 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.157857 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" podUID="37b7d1f8-a668-44e9-af8d-0be7555bf2f6" Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.173401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" event={"ID":"c4162b99-a9c7-471c-a408-058ffb74fe69","Type":"ContainerStarted","Data":"17c524a0d471cea3265e5df86a50e01ba9cdb0456dd9a32d7e0875d91b7caed0"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.200452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" event={"ID":"4882c949-7a46-408b-a5ee-fc0fdcc6b291","Type":"ContainerStarted","Data":"b9e06f8819be3fff65a449ca79ecabba1256720706bbacc5c50b102dc89a35d9"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.201056 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.206248 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" podUID="4882c949-7a46-408b-a5ee-fc0fdcc6b291" Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.211296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.211838 4764 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.211899 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert podName:4e494874-aa22-4cbb-aef2-a20b3ad2eea3 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:55.211869301 +0000 UTC m=+1070.973193712 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert") pod "infra-operator-controller-manager-57548d458d-nt8f5" (UID: "4e494874-aa22-4cbb-aef2-a20b3ad2eea3") : secret "infra-operator-webhook-server-cert" not found Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.219207 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" event={"ID":"f2ff43e5-0f12-4008-a286-e6872cf78923","Type":"ContainerStarted","Data":"a0763753cc7afa265ec50e1568448b7ba761b457ab778b7211ccd71d2feeee05"} Dec 03 23:58:39 crc kubenswrapper[4764]: I1203 23:58:39.616322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.616524 4764 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:39 crc kubenswrapper[4764]: E1203 23:58:39.616732 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert podName:f18b5092-5d70-482a-af1f-be661a68701e nodeName:}" failed. No retries permitted until 2025-12-03 23:58:55.616697421 +0000 UTC m=+1071.378021832 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" (UID: "f18b5092-5d70-482a-af1f-be661a68701e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 23:58:40 crc kubenswrapper[4764]: I1203 23:58:40.044683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:40 crc kubenswrapper[4764]: I1203 23:58:40.044766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:40 crc kubenswrapper[4764]: E1203 23:58:40.045039 4764 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 23:58:40 crc kubenswrapper[4764]: E1203 23:58:40.045160 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs podName:7c88c833-c710-44c7-9bfb-a684a7f39c39 nodeName:}" failed. No retries permitted until 2025-12-03 23:58:56.045133949 +0000 UTC m=+1071.806458400 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-6ssgm" (UID: "7c88c833-c710-44c7-9bfb-a684a7f39c39") : secret "webhook-server-cert" not found Dec 03 23:58:40 crc kubenswrapper[4764]: I1203 23:58:40.058549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:40 crc kubenswrapper[4764]: E1203 23:58:40.232313 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" podUID="4882c949-7a46-408b-a5ee-fc0fdcc6b291" Dec 03 23:58:40 crc kubenswrapper[4764]: E1203 23:58:40.234562 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" podUID="94c55b91-cbc9-47c5-8abc-5140aeebf8d0" Dec 03 23:58:40 crc kubenswrapper[4764]: E1203 23:58:40.235328 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" podUID="37b7d1f8-a668-44e9-af8d-0be7555bf2f6" Dec 03 23:58:43 
crc kubenswrapper[4764]: I1203 23:58:43.525871 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" Dec 03 23:58:43 crc kubenswrapper[4764]: E1203 23:58:43.527878 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" podUID="94c55b91-cbc9-47c5-8abc-5140aeebf8d0" Dec 03 23:58:43 crc kubenswrapper[4764]: I1203 23:58:43.864005 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" Dec 03 23:58:43 crc kubenswrapper[4764]: E1203 23:58:43.866505 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" podUID="4882c949-7a46-408b-a5ee-fc0fdcc6b291" Dec 03 23:58:43 crc kubenswrapper[4764]: I1203 23:58:43.918001 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" Dec 03 23:58:43 crc kubenswrapper[4764]: E1203 23:58:43.921215 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" podUID="37b7d1f8-a668-44e9-af8d-0be7555bf2f6" Dec 03 23:58:50 crc kubenswrapper[4764]: I1203 23:58:50.869462 4764 patch_prober.go:28] interesting 
pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 23:58:50 crc kubenswrapper[4764]: I1203 23:58:50.870243 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 23:58:50 crc kubenswrapper[4764]: I1203 23:58:50.870325 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 03 23:58:50 crc kubenswrapper[4764]: I1203 23:58:50.871245 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a4a22d80a831b04f5a3234f6450f79d1fb6db8ec2fe0aa77fcaeb8ebd9ef8e9"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 23:58:50 crc kubenswrapper[4764]: I1203 23:58:50.871339 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://6a4a22d80a831b04f5a3234f6450f79d1fb6db8ec2fe0aa77fcaeb8ebd9ef8e9" gracePeriod=600 Dec 03 23:58:51 crc kubenswrapper[4764]: I1203 23:58:51.325407 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="6a4a22d80a831b04f5a3234f6450f79d1fb6db8ec2fe0aa77fcaeb8ebd9ef8e9" exitCode=0 Dec 03 23:58:51 crc kubenswrapper[4764]: I1203 23:58:51.325468 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"6a4a22d80a831b04f5a3234f6450f79d1fb6db8ec2fe0aa77fcaeb8ebd9ef8e9"} Dec 03 23:58:51 crc kubenswrapper[4764]: I1203 23:58:51.325529 4764 scope.go:117] "RemoveContainer" containerID="1120b0acc6513dd274be5731fb29ccd1424c55fbae3e411c28a5dc8b386fb90b" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.365037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" event={"ID":"c4162b99-a9c7-471c-a408-058ffb74fe69","Type":"ContainerStarted","Data":"94db873986dec5e4a2a68c81b6c856f133829529898e838c03a929691a8732d3"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.365633 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.367330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" event={"ID":"06f61b54-9d32-467a-be0b-07f8fdf867aa","Type":"ContainerStarted","Data":"39a563e7f1fa7f2a3f5460500b9b9d848e3ad500ada4edb02d0f2fceb2561fb1"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.368087 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.371760 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.375403 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" 
Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.376348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" event={"ID":"f2ff43e5-0f12-4008-a286-e6872cf78923","Type":"ContainerStarted","Data":"32369a1fcc996350ee40bec49d79269231beff730465cab656e8b27ec99c3e25"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.384669 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.389077 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-qwfd7" podStartSLOduration=2.866724769 podStartE2EDuration="30.38906179s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.117997102 +0000 UTC m=+1040.879321513" lastFinishedPulling="2025-12-03 23:58:52.640334103 +0000 UTC m=+1068.401658534" observedRunningTime="2025-12-03 23:58:53.38536372 +0000 UTC m=+1069.146688121" watchObservedRunningTime="2025-12-03 23:58:53.38906179 +0000 UTC m=+1069.150386201" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.392188 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.399881 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" event={"ID":"ae29fd3c-f586-427c-a482-fdc2f609aa25","Type":"ContainerStarted","Data":"e357f6a767c36881b8b830399190843193f24728b16b71e66ecbafb53d37fef3"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.448978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" 
event={"ID":"984c6845-3698-46c3-9d88-416635322b98","Type":"ContainerStarted","Data":"0eca8c68ce88b773459e8cee060a24b82f10706a7eaf73729bf9771789807231"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.451809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.461095 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.461516 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-qxhzq" podStartSLOduration=2.852313195 podStartE2EDuration="30.461501265s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.135683265 +0000 UTC m=+1040.897007666" lastFinishedPulling="2025-12-03 23:58:52.744871315 +0000 UTC m=+1068.506195736" observedRunningTime="2025-12-03 23:58:53.450583128 +0000 UTC m=+1069.211907539" watchObservedRunningTime="2025-12-03 23:58:53.461501265 +0000 UTC m=+1069.222825676" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.505781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" event={"ID":"64d40f74-d89f-4e84-bb91-ff8cdcfdc747","Type":"ContainerStarted","Data":"a72080668d82783d460b93197f88e51c4682921501937f3e9b40c580df3b3173"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.505815 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.512984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.514873 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-k66k8" podStartSLOduration=2.56680164 podStartE2EDuration="30.514854293s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.735037788 +0000 UTC m=+1040.496362199" lastFinishedPulling="2025-12-03 23:58:52.683090431 +0000 UTC m=+1068.444414852" observedRunningTime="2025-12-03 23:58:53.479482606 +0000 UTC m=+1069.240807017" watchObservedRunningTime="2025-12-03 23:58:53.514854293 +0000 UTC m=+1069.276178714" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.567999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"a2a1a3ac2c269173a49ebd8f63b614762be69151c2f69effa92f89083eb82227"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.577882 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-xrd58" podStartSLOduration=2.89449426 podStartE2EDuration="30.577867327s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.956833063 +0000 UTC m=+1040.718157474" lastFinishedPulling="2025-12-03 23:58:52.64020612 +0000 UTC m=+1068.401530541" observedRunningTime="2025-12-03 23:58:53.572248759 +0000 UTC m=+1069.333573190" watchObservedRunningTime="2025-12-03 23:58:53.577867327 +0000 UTC m=+1069.339191738" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.593750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" 
event={"ID":"d703896f-745b-4359-8d2a-2c4b7cf5d062","Type":"ContainerStarted","Data":"04cc31b0baae7cf8bf7693aa5b6f79fffc4a326ebd03638b6cbc5d4a089aef1a"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.594548 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.611282 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.613194 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" event={"ID":"2ccf4351-3ae2-432d-ae11-1a07dab689ae","Type":"ContainerStarted","Data":"5a6ed81a352e6303cbccce2acd1461707bbbf779c5b570269120098cbcdc8954"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.642533 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vglgf" podStartSLOduration=2.774249494 podStartE2EDuration="30.642513361s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.875042589 +0000 UTC m=+1040.636367000" lastFinishedPulling="2025-12-03 23:58:52.743306456 +0000 UTC m=+1068.504630867" observedRunningTime="2025-12-03 23:58:53.640291707 +0000 UTC m=+1069.401616118" watchObservedRunningTime="2025-12-03 23:58:53.642513361 +0000 UTC m=+1069.403837762" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.645483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" event={"ID":"0f640363-0b23-4625-bf4d-2829b924640d","Type":"ContainerStarted","Data":"c1bc4ff2ef07fe564692572fd6839d4d80208847a368d721256fc8bcffc0eb35"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.674324 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" event={"ID":"d4dea3da-3ffa-4da1-9a93-5f3233112a23","Type":"ContainerStarted","Data":"f5cc930782df8bfc6294f8bf3b1c8d7e9416a18063233dfe984d363fd4f68827"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.678604 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-fj6g2" podStartSLOduration=3.287344117 podStartE2EDuration="30.678588665s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.31133674 +0000 UTC m=+1041.072661151" lastFinishedPulling="2025-12-03 23:58:52.702581288 +0000 UTC m=+1068.463905699" observedRunningTime="2025-12-03 23:58:53.675461958 +0000 UTC m=+1069.436786369" watchObservedRunningTime="2025-12-03 23:58:53.678588665 +0000 UTC m=+1069.439913076" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.710072 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-p7ptg" podStartSLOduration=2.397229209 podStartE2EDuration="29.710056696s" podCreationTimestamp="2025-12-03 23:58:24 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.287554667 +0000 UTC m=+1041.048879078" lastFinishedPulling="2025-12-03 23:58:52.600382164 +0000 UTC m=+1068.361706565" observedRunningTime="2025-12-03 23:58:53.70858343 +0000 UTC m=+1069.469907841" watchObservedRunningTime="2025-12-03 23:58:53.710056696 +0000 UTC m=+1069.471381107" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.711515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" event={"ID":"ebef9632-34f5-48ce-9a64-c76cf619498e","Type":"ContainerStarted","Data":"3b44155b279813876dc7805957895e1423ca57245e91f209f208e1babe66eeb5"} Dec 03 23:58:53 crc kubenswrapper[4764]: 
I1203 23:58:53.713228 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.721304 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.736918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" event={"ID":"247a518a-17e9-482b-bab7-832b31fa99e1","Type":"ContainerStarted","Data":"5eaf76096495b8c78087e8403996e83d9f034c808dc30fdfb639c37ac0c9b0ce"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.758663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" event={"ID":"e69ad255-4b3d-4c49-ad8a-59850f846c00","Type":"ContainerStarted","Data":"cffacaa6c949af358173c6ce701f0dc5cf6f8610debfada61415f00effc16ee0"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.759381 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.765837 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-gp7h7" podStartSLOduration=3.122786993 podStartE2EDuration="30.765819042s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.957308714 +0000 UTC m=+1040.718633135" lastFinishedPulling="2025-12-03 23:58:52.600340773 +0000 UTC m=+1068.361665184" observedRunningTime="2025-12-03 23:58:53.747988766 +0000 UTC m=+1069.509313167" watchObservedRunningTime="2025-12-03 23:58:53.765819042 +0000 UTC m=+1069.527143453" Dec 03 23:58:53 crc kubenswrapper[4764]: 
I1203 23:58:53.766101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" event={"ID":"ad7f7e9e-482b-415c-bf8d-02c9efbe387d","Type":"ContainerStarted","Data":"40da2c077ae0898e14790df62d2d81008b87d6d7872dd233e9655b59be3bf988"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.766376 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.767064 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.770348 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.821182 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" event={"ID":"ee4f58e7-bb44-483d-9ab4-c1e447c5e68c","Type":"ContainerStarted","Data":"d62caed8e43827eaec38aaf9f71d96858036f568c8171aaa3c505ae59a91991c"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.821969 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.823827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" event={"ID":"2ec14efa-ac80-45f6-bcd2-20b404087776","Type":"ContainerStarted","Data":"fd0f17b8f54388206a77549fbc87429add3bea2a807fa3b0f5c870be9b218562"} Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.824366 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.825948 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.839474 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-bvqgr" podStartSLOduration=2.707830576 podStartE2EDuration="30.839456477s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.502792857 +0000 UTC m=+1040.264117268" lastFinishedPulling="2025-12-03 23:58:52.634418718 +0000 UTC m=+1068.395743169" observedRunningTime="2025-12-03 23:58:53.820663256 +0000 UTC m=+1069.581987677" watchObservedRunningTime="2025-12-03 23:58:53.839456477 +0000 UTC m=+1069.600780888" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.844801 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-zj7br" podStartSLOduration=3.314444221 podStartE2EDuration="30.844785138s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.117815898 +0000 UTC m=+1040.879140309" lastFinishedPulling="2025-12-03 23:58:52.648156795 +0000 UTC m=+1068.409481226" observedRunningTime="2025-12-03 23:58:53.839751304 +0000 UTC m=+1069.601075715" watchObservedRunningTime="2025-12-03 23:58:53.844785138 +0000 UTC m=+1069.606109549" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.860018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" event={"ID":"3178ac5b-9384-4653-bbc2-713a718eac88","Type":"ContainerStarted","Data":"2a321bf3694fc97977ef00bb1de3baa00981483696332878aa0f6715c3655ed7"} Dec 03 23:58:53 crc 
kubenswrapper[4764]: I1203 23:58:53.861042 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.870071 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.888743 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-k5sms" podStartSLOduration=3.32466291 podStartE2EDuration="30.888729544s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.105351182 +0000 UTC m=+1040.866675603" lastFinishedPulling="2025-12-03 23:58:52.669417826 +0000 UTC m=+1068.430742237" observedRunningTime="2025-12-03 23:58:53.887891334 +0000 UTC m=+1069.649215765" watchObservedRunningTime="2025-12-03 23:58:53.888729544 +0000 UTC m=+1069.650053955" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.926811 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" podStartSLOduration=3.517204619 podStartE2EDuration="30.926795957s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.145657 +0000 UTC m=+1040.906981411" lastFinishedPulling="2025-12-03 23:58:52.555248338 +0000 UTC m=+1068.316572749" observedRunningTime="2025-12-03 23:58:53.923416834 +0000 UTC m=+1069.684741245" watchObservedRunningTime="2025-12-03 23:58:53.926795957 +0000 UTC m=+1069.688120368" Dec 03 23:58:53 crc kubenswrapper[4764]: I1203 23:58:53.962647 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-dt6f8" podStartSLOduration=3.241242046 
podStartE2EDuration="30.962630185s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.971728918 +0000 UTC m=+1040.733053329" lastFinishedPulling="2025-12-03 23:58:52.693117057 +0000 UTC m=+1068.454441468" observedRunningTime="2025-12-03 23:58:53.96200941 +0000 UTC m=+1069.723333821" watchObservedRunningTime="2025-12-03 23:58:53.962630185 +0000 UTC m=+1069.723954586" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.869485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" event={"ID":"0f640363-0b23-4625-bf4d-2829b924640d","Type":"ContainerStarted","Data":"4afc14eca273c8964c8b6cf327860cfdedd5af3aea0ddb1c67d0b0356eea2fc2"} Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.869864 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.871649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" event={"ID":"2ec14efa-ac80-45f6-bcd2-20b404087776","Type":"ContainerStarted","Data":"e02e774dccf108daf1afb23acf3a987cc4390460189bf20111539e0c05e46c61"} Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.873889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" event={"ID":"94c55b91-cbc9-47c5-8abc-5140aeebf8d0","Type":"ContainerStarted","Data":"b8194982653d980d681a6bec4ed604aa869ea27c10bc323b255b1e5ab7b733aa"} Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.876023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" 
event={"ID":"ae29fd3c-f586-427c-a482-fdc2f609aa25","Type":"ContainerStarted","Data":"86c0f8a2f3e1728ab70e71e21c4feba77c06c3274c8a94a3af1b93e29cc54cd9"} Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.876086 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.878151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" event={"ID":"2ccf4351-3ae2-432d-ae11-1a07dab689ae","Type":"ContainerStarted","Data":"701fece5b6ad4c05b2c286214576f40b32d932917784d2eec4f9d4a844d361ff"} Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.878296 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.880893 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" event={"ID":"247a518a-17e9-482b-bab7-832b31fa99e1","Type":"ContainerStarted","Data":"379157dd9f9ff045f79e0da8f591b6691d4266c9c60f2b0d0da330771f02bfb7"} Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.881247 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.920424 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" podStartSLOduration=4.683788444 podStartE2EDuration="31.920396094s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.318172327 +0000 UTC m=+1041.079496738" lastFinishedPulling="2025-12-03 23:58:52.554779967 +0000 UTC m=+1068.316104388" observedRunningTime="2025-12-03 
23:58:54.898373634 +0000 UTC m=+1070.659698085" watchObservedRunningTime="2025-12-03 23:58:54.920396094 +0000 UTC m=+1070.681720515" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.925981 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" podStartSLOduration=4.688649933 podStartE2EDuration="31.92596998s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.316211309 +0000 UTC m=+1041.077535720" lastFinishedPulling="2025-12-03 23:58:52.553531346 +0000 UTC m=+1068.314855767" observedRunningTime="2025-12-03 23:58:54.914250313 +0000 UTC m=+1070.675574754" watchObservedRunningTime="2025-12-03 23:58:54.92596998 +0000 UTC m=+1070.687294401" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.950691 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" podStartSLOduration=4.719303075 podStartE2EDuration="31.950673496s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.323861137 +0000 UTC m=+1041.085185548" lastFinishedPulling="2025-12-03 23:58:52.555231538 +0000 UTC m=+1068.316555969" observedRunningTime="2025-12-03 23:58:54.943693545 +0000 UTC m=+1070.705017966" watchObservedRunningTime="2025-12-03 23:58:54.950673496 +0000 UTC m=+1070.711997897" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.963678 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" podStartSLOduration=4.721918089 podStartE2EDuration="31.963660824s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.313215136 +0000 UTC m=+1041.074539547" lastFinishedPulling="2025-12-03 23:58:52.554957871 +0000 UTC m=+1068.316282282" observedRunningTime="2025-12-03 23:58:54.958541559 +0000 
UTC m=+1070.719865970" watchObservedRunningTime="2025-12-03 23:58:54.963660824 +0000 UTC m=+1070.724985235" Dec 03 23:58:54 crc kubenswrapper[4764]: I1203 23:58:54.974850 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-7kt8t" podStartSLOduration=18.706649442 podStartE2EDuration="31.974829078s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:24.527120013 +0000 UTC m=+1040.288444424" lastFinishedPulling="2025-12-03 23:58:37.795299649 +0000 UTC m=+1053.556624060" observedRunningTime="2025-12-03 23:58:54.974205882 +0000 UTC m=+1070.735530303" watchObservedRunningTime="2025-12-03 23:58:54.974829078 +0000 UTC m=+1070.736153489" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.246447 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.251930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e494874-aa22-4cbb-aef2-a20b3ad2eea3-cert\") pod \"infra-operator-controller-manager-57548d458d-nt8f5\" (UID: \"4e494874-aa22-4cbb-aef2-a20b3ad2eea3\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.427794 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.651052 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.656299 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f18b5092-5d70-482a-af1f-be661a68701e-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686p2kgq\" (UID: \"f18b5092-5d70-482a-af1f-be661a68701e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.920329 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.950352 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5"] Dec 03 23:58:55 crc kubenswrapper[4764]: W1203 23:58:55.956385 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e494874_aa22_4cbb_aef2_a20b3ad2eea3.slice/crio-724f12d1ac40f492821670c41218f842edae4533054ea3e4dfffb7f947afce6f WatchSource:0}: Error finding container 724f12d1ac40f492821670c41218f842edae4533054ea3e4dfffb7f947afce6f: Status 404 returned error can't find the container with id 724f12d1ac40f492821670c41218f842edae4533054ea3e4dfffb7f947afce6f Dec 03 23:58:55 crc kubenswrapper[4764]: I1203 23:58:55.960363 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.059584 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.068526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7c88c833-c710-44c7-9bfb-a684a7f39c39-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-6ssgm\" (UID: \"7c88c833-c710-44c7-9bfb-a684a7f39c39\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.354143 4764 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.380795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq"] Dec 03 23:58:56 crc kubenswrapper[4764]: W1203 23:58:56.387664 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf18b5092_5d70_482a_af1f_be661a68701e.slice/crio-edbade773e85ff5d0ed4a84f9f8b2fc5853cff435addffd5b4f74adc57d53fc1 WatchSource:0}: Error finding container edbade773e85ff5d0ed4a84f9f8b2fc5853cff435addffd5b4f74adc57d53fc1: Status 404 returned error can't find the container with id edbade773e85ff5d0ed4a84f9f8b2fc5853cff435addffd5b4f74adc57d53fc1 Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.615984 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm"] Dec 03 23:58:56 crc kubenswrapper[4764]: W1203 23:58:56.620249 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c88c833_c710_44c7_9bfb_a684a7f39c39.slice/crio-6a3f747a3d482c7f106508aff9bc4b0f5bb7d7e2e5cc1ec3e051a983cdd5be01 WatchSource:0}: Error finding container 6a3f747a3d482c7f106508aff9bc4b0f5bb7d7e2e5cc1ec3e051a983cdd5be01: Status 404 returned error can't find the container with id 6a3f747a3d482c7f106508aff9bc4b0f5bb7d7e2e5cc1ec3e051a983cdd5be01 Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.903410 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" event={"ID":"7c88c833-c710-44c7-9bfb-a684a7f39c39","Type":"ContainerStarted","Data":"1893048b8273a9b1748a47e1aa3dbe8b889dc30407ad3c0b91c68fe2ac9e9bfe"} Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 
23:58:56.903473 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" event={"ID":"7c88c833-c710-44c7-9bfb-a684a7f39c39","Type":"ContainerStarted","Data":"6a3f747a3d482c7f106508aff9bc4b0f5bb7d7e2e5cc1ec3e051a983cdd5be01"} Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.903508 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.906862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" event={"ID":"f18b5092-5d70-482a-af1f-be661a68701e","Type":"ContainerStarted","Data":"edbade773e85ff5d0ed4a84f9f8b2fc5853cff435addffd5b4f74adc57d53fc1"} Dec 03 23:58:56 crc kubenswrapper[4764]: I1203 23:58:56.908273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" event={"ID":"4e494874-aa22-4cbb-aef2-a20b3ad2eea3","Type":"ContainerStarted","Data":"724f12d1ac40f492821670c41218f842edae4533054ea3e4dfffb7f947afce6f"} Dec 03 23:58:57 crc kubenswrapper[4764]: I1203 23:58:57.923702 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" event={"ID":"4882c949-7a46-408b-a5ee-fc0fdcc6b291","Type":"ContainerStarted","Data":"2bd4e57652ca6552517c5cf3cbc034e2631c5b44ef1ce2b6cceaf8e7a8a2a6cf"} Dec 03 23:58:57 crc kubenswrapper[4764]: I1203 23:58:57.942622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" event={"ID":"37b7d1f8-a668-44e9-af8d-0be7555bf2f6","Type":"ContainerStarted","Data":"5ea2e06cb033a141923b44ec3a8d56d16fe3845d99a3ca2763199419f8f35e60"} Dec 03 23:58:57 crc kubenswrapper[4764]: I1203 23:58:57.944470 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" podStartSLOduration=34.944449597 podStartE2EDuration="34.944449597s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:58:56.938399844 +0000 UTC m=+1072.699724275" watchObservedRunningTime="2025-12-03 23:58:57.944449597 +0000 UTC m=+1073.705774008" Dec 03 23:58:57 crc kubenswrapper[4764]: I1203 23:58:57.946975 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-q2lz5" podStartSLOduration=22.299416461 podStartE2EDuration="34.946965678s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.113789429 +0000 UTC m=+1040.875113840" lastFinishedPulling="2025-12-03 23:58:37.761338646 +0000 UTC m=+1053.522663057" observedRunningTime="2025-12-03 23:58:57.941660498 +0000 UTC m=+1073.702984909" watchObservedRunningTime="2025-12-03 23:58:57.946965678 +0000 UTC m=+1073.708290089" Dec 03 23:58:57 crc kubenswrapper[4764]: I1203 23:58:57.947831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" event={"ID":"4e494874-aa22-4cbb-aef2-a20b3ad2eea3","Type":"ContainerStarted","Data":"47a381fe8047cf907e92fb959ae2074459a29b25ba346fa82d113d64760cfd09"} Dec 03 23:58:57 crc kubenswrapper[4764]: I1203 23:58:57.972448 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-nt8s2" podStartSLOduration=22.321439151 podStartE2EDuration="34.972429292s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:25.145369893 +0000 UTC m=+1040.906694314" lastFinishedPulling="2025-12-03 
23:58:37.796360004 +0000 UTC m=+1053.557684455" observedRunningTime="2025-12-03 23:58:57.969239564 +0000 UTC m=+1073.730563985" watchObservedRunningTime="2025-12-03 23:58:57.972429292 +0000 UTC m=+1073.733753703" Dec 03 23:58:59 crc kubenswrapper[4764]: I1203 23:58:59.962992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" event={"ID":"f18b5092-5d70-482a-af1f-be661a68701e","Type":"ContainerStarted","Data":"a96584929db85ab6cee0e98beda37c0bf72086052a1fb3de50fe5f10ac914b17"} Dec 03 23:58:59 crc kubenswrapper[4764]: I1203 23:58:59.963390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" event={"ID":"f18b5092-5d70-482a-af1f-be661a68701e","Type":"ContainerStarted","Data":"344ecbc4e7204957de5d9cce23661c3f5004312088ef24462ada3fb0bed35c97"} Dec 03 23:58:59 crc kubenswrapper[4764]: I1203 23:58:59.964673 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:58:59 crc kubenswrapper[4764]: I1203 23:58:59.968039 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" event={"ID":"4e494874-aa22-4cbb-aef2-a20b3ad2eea3","Type":"ContainerStarted","Data":"ab8afb8557bbce4b9bf5c6d398ab22797af6e1988aec15a728b97b242c176701"} Dec 03 23:58:59 crc kubenswrapper[4764]: I1203 23:58:59.968187 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:58:59 crc kubenswrapper[4764]: I1203 23:58:59.989961 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" podStartSLOduration=34.735024415 
podStartE2EDuration="36.98994185s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:56.395353517 +0000 UTC m=+1072.156677968" lastFinishedPulling="2025-12-03 23:58:58.650270992 +0000 UTC m=+1074.411595403" observedRunningTime="2025-12-03 23:58:59.988576186 +0000 UTC m=+1075.749900617" watchObservedRunningTime="2025-12-03 23:58:59.98994185 +0000 UTC m=+1075.751266281" Dec 03 23:59:00 crc kubenswrapper[4764]: I1203 23:59:00.027504 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" podStartSLOduration=35.28657715 podStartE2EDuration="37.02748315s" podCreationTimestamp="2025-12-03 23:58:23 +0000 UTC" firstStartedPulling="2025-12-03 23:58:55.959966698 +0000 UTC m=+1071.721291119" lastFinishedPulling="2025-12-03 23:58:57.700872708 +0000 UTC m=+1073.462197119" observedRunningTime="2025-12-03 23:59:00.010275958 +0000 UTC m=+1075.771600379" watchObservedRunningTime="2025-12-03 23:59:00.02748315 +0000 UTC m=+1075.788807581" Dec 03 23:59:04 crc kubenswrapper[4764]: I1203 23:59:04.018749 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-975dt" Dec 03 23:59:04 crc kubenswrapper[4764]: I1203 23:59:04.098631 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2hmdx" Dec 03 23:59:04 crc kubenswrapper[4764]: I1203 23:59:04.212133 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-vzgmn" Dec 03 23:59:04 crc kubenswrapper[4764]: I1203 23:59:04.407738 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-gd8dc" Dec 03 23:59:04 crc kubenswrapper[4764]: I1203 23:59:04.428237 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-s2rm5" Dec 03 23:59:05 crc kubenswrapper[4764]: I1203 23:59:05.436532 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-nt8f5" Dec 03 23:59:05 crc kubenswrapper[4764]: I1203 23:59:05.931094 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686p2kgq" Dec 03 23:59:06 crc kubenswrapper[4764]: I1203 23:59:06.364755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-6ssgm" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.503357 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-h4kzr"] Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.518020 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.519682 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.520101 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.520840 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.523096 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-h4kzr"] Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.525191 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-dk9dd" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.569824 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-vlqp6"] Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.571010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.575139 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.576218 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4311a0b-3468-40bc-8cb0-a2e86e611df3-config\") pod \"dnsmasq-dns-5cd484bb89-h4kzr\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.576286 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-config\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.576309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-dns-svc\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.576323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbmq6\" (UniqueName: \"kubernetes.io/projected/c4311a0b-3468-40bc-8cb0-a2e86e611df3-kube-api-access-wbmq6\") pod \"dnsmasq-dns-5cd484bb89-h4kzr\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.576353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ptmqz\" (UniqueName: \"kubernetes.io/projected/b723e904-fceb-4071-ac70-9e18f7b1ee24-kube-api-access-ptmqz\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.584967 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-vlqp6"] Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.677482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-config\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.677529 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-dns-svc\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.677546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbmq6\" (UniqueName: \"kubernetes.io/projected/c4311a0b-3468-40bc-8cb0-a2e86e611df3-kube-api-access-wbmq6\") pod \"dnsmasq-dns-5cd484bb89-h4kzr\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.677578 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptmqz\" (UniqueName: \"kubernetes.io/projected/b723e904-fceb-4071-ac70-9e18f7b1ee24-kube-api-access-ptmqz\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc 
kubenswrapper[4764]: I1203 23:59:22.677645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4311a0b-3468-40bc-8cb0-a2e86e611df3-config\") pod \"dnsmasq-dns-5cd484bb89-h4kzr\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.678402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-dns-svc\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.678551 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4311a0b-3468-40bc-8cb0-a2e86e611df3-config\") pod \"dnsmasq-dns-5cd484bb89-h4kzr\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.679087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-config\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.695647 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbmq6\" (UniqueName: \"kubernetes.io/projected/c4311a0b-3468-40bc-8cb0-a2e86e611df3-kube-api-access-wbmq6\") pod \"dnsmasq-dns-5cd484bb89-h4kzr\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.697118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ptmqz\" (UniqueName: \"kubernetes.io/projected/b723e904-fceb-4071-ac70-9e18f7b1ee24-kube-api-access-ptmqz\") pod \"dnsmasq-dns-567c455747-vlqp6\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.838711 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:22 crc kubenswrapper[4764]: I1203 23:59:22.888402 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:23 crc kubenswrapper[4764]: I1203 23:59:23.166970 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-vlqp6"] Dec 03 23:59:23 crc kubenswrapper[4764]: I1203 23:59:23.355213 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-h4kzr"] Dec 03 23:59:24 crc kubenswrapper[4764]: I1203 23:59:24.178487 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-vlqp6" event={"ID":"b723e904-fceb-4071-ac70-9e18f7b1ee24","Type":"ContainerStarted","Data":"3806e0a366e7d5530c02d1214b5a964e6ef4bd42b9a0c02048220b01de3e84a6"} Dec 03 23:59:24 crc kubenswrapper[4764]: I1203 23:59:24.180543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" event={"ID":"c4311a0b-3468-40bc-8cb0-a2e86e611df3","Type":"ContainerStarted","Data":"2e56e6bb8b3c8c1546dd2846b657495b5b5cadd8f0a38cc9a47b6967b05f325f"} Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.375323 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-vlqp6"] Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.394996 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-r9m9v"] Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.396165 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.423002 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-r9m9v"] Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.519749 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.519841 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4v4x\" (UniqueName: \"kubernetes.io/projected/78ea605b-ce13-4473-aa8c-b0358c0bc35a-kube-api-access-l4v4x\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.519868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-config\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.621440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4v4x\" (UniqueName: \"kubernetes.io/projected/78ea605b-ce13-4473-aa8c-b0358c0bc35a-kube-api-access-l4v4x\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.621684 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-config\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.621751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.622564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.622580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-config\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.670422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4v4x\" (UniqueName: \"kubernetes.io/projected/78ea605b-ce13-4473-aa8c-b0358c0bc35a-kube-api-access-l4v4x\") pod \"dnsmasq-dns-bc4b48fc9-r9m9v\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.684170 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-h4kzr"] Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.714048 4764 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.718648 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb666b895-bc2xr"] Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.722682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.748202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-bc2xr"] Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.826944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-config\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.827022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxcw\" (UniqueName: \"kubernetes.io/projected/1378ad3a-99d0-47e1-8c66-74a67341e30d-kube-api-access-vlxcw\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.827051 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-dns-svc\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.928946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxcw\" (UniqueName: 
\"kubernetes.io/projected/1378ad3a-99d0-47e1-8c66-74a67341e30d-kube-api-access-vlxcw\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.930388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-dns-svc\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.930504 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-config\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.931364 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-dns-svc\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.934772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-config\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:25 crc kubenswrapper[4764]: I1203 23:59:25.958657 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxcw\" (UniqueName: \"kubernetes.io/projected/1378ad3a-99d0-47e1-8c66-74a67341e30d-kube-api-access-vlxcw\") pod \"dnsmasq-dns-cb666b895-bc2xr\" (UID: 
\"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.048769 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.221519 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-r9m9v"] Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.496827 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-bc2xr"] Dec 03 23:59:26 crc kubenswrapper[4764]: W1203 23:59:26.499378 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1378ad3a_99d0_47e1_8c66_74a67341e30d.slice/crio-62709bbc819ce788c1ac91e657496bf29833bdb4cd3472957e498526c771d492 WatchSource:0}: Error finding container 62709bbc819ce788c1ac91e657496bf29833bdb4cd3472957e498526c771d492: Status 404 returned error can't find the container with id 62709bbc819ce788c1ac91e657496bf29833bdb4cd3472957e498526c771d492 Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.562241 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.566535 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.567085 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.569678 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-7hrdv" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.570625 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.570694 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.571359 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.571501 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.571607 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.571730 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76708e9b-1db4-42ca-94d2-7ff96d08d855-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741127 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741293 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741368 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76708e9b-1db4-42ca-94d2-7ff96d08d855-pod-info\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741418 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-server-conf\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldxds\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-kube-api-access-ldxds\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741794 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.741820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.879808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.879917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76708e9b-1db4-42ca-94d2-7ff96d08d855-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.879959 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.879979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.880002 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.880025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76708e9b-1db4-42ca-94d2-7ff96d08d855-pod-info\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.880056 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.880085 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.880254 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.883022 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.884023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.884265 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.884689 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.885497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.885621 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-server-conf\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.885649 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldxds\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-kube-api-access-ldxds\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.885690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.887036 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-server-conf\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.889638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.889641 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.891639 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d8jll" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.891754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.891900 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.892192 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.892225 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.893753 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.894304 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76708e9b-1db4-42ca-94d2-7ff96d08d855-pod-info\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.894539 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.894765 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.915874 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.916087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldxds\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-kube-api-access-ldxds\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.916133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.916581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/76708e9b-1db4-42ca-94d2-7ff96d08d855-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") " pod="openstack/rabbitmq-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992629 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992765 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvl4\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-kube-api-access-jzvl4\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:26 crc kubenswrapper[4764]: I1203 23:59:26.992931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.093950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094022 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094055 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094107 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvl4\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-kube-api-access-jzvl4\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094200 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-erlang-cookie\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094664 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.094682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.095018 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.095038 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.095335 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.095496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.095591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.096600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.101398 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.131457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.132128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.135354 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.140863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.150976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvl4\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-kube-api-access-jzvl4\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.202146 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.238065 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" event={"ID":"1378ad3a-99d0-47e1-8c66-74a67341e30d","Type":"ContainerStarted","Data":"62709bbc819ce788c1ac91e657496bf29833bdb4cd3472957e498526c771d492"} Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.239436 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" event={"ID":"78ea605b-ce13-4473-aa8c-b0358c0bc35a","Type":"ContainerStarted","Data":"5f8f5a5caebbb5475a05671fbc55a0b9cc4a3a83feffbd4331f4218aa785f6c0"} Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.267450 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.711110 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.770729 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.943921 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.945295 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.948922 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.949241 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.949948 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.950514 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-6x254" Dec 03 23:59:27 crc kubenswrapper[4764]: I1203 23:59:27.951978 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.022194 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-kolla-config\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129569 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8586564-9024-4375-a5f7-e75844abe723-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129620 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-config-data-default\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129655 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl86m\" (UniqueName: \"kubernetes.io/projected/d8586564-9024-4375-a5f7-e75844abe723-kube-api-access-cl86m\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.129688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.232917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-kolla-config\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8586564-9024-4375-a5f7-e75844abe723-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " 
pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233221 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-config-data-default\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233285 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233300 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl86m\" (UniqueName: \"kubernetes.io/projected/d8586564-9024-4375-a5f7-e75844abe723-kube-api-access-cl86m\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233747 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.233777 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-kolla-config\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.234445 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8586564-9024-4375-a5f7-e75844abe723-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.236293 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.237201 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-config-data-default\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.239479 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.247469 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.252148 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl86m\" (UniqueName: 
\"kubernetes.io/projected/d8586564-9024-4375-a5f7-e75844abe723-kube-api-access-cl86m\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.282820 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " pod="openstack/openstack-galera-0" Dec 03 23:59:28 crc kubenswrapper[4764]: I1203 23:59:28.333796 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.478150 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.486102 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.491702 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.491881 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.491921 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.491938 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.492918 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-szqqz" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659539 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659658 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6160ab00-1691-41f8-9902-80d33e435770-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659748 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659783 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l78mv\" (UniqueName: \"kubernetes.io/projected/6160ab00-1691-41f8-9902-80d33e435770-kube-api-access-l78mv\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.659819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.734272 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.739206 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.740773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.752226 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.752353 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-2zvpx" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.752391 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761175 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6160ab00-1691-41f8-9902-80d33e435770-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761258 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l78mv\" (UniqueName: \"kubernetes.io/projected/6160ab00-1691-41f8-9902-80d33e435770-kube-api-access-l78mv\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761308 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.761356 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.763078 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.763367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6160ab00-1691-41f8-9902-80d33e435770-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.763655 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.763896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.776968 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.783325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.791188 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.791473 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l78mv\" (UniqueName: \"kubernetes.io/projected/6160ab00-1691-41f8-9902-80d33e435770-kube-api-access-l78mv\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.812436 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.816487 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.862439 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-kolla-config\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.862475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gxhs\" (UniqueName: \"kubernetes.io/projected/662de035-d0f1-4a65-98ad-161d6f21bd26-kube-api-access-9gxhs\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.862502 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-config-data\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.862536 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.862554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 
23:59:29.964230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-kolla-config\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.964499 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gxhs\" (UniqueName: \"kubernetes.io/projected/662de035-d0f1-4a65-98ad-161d6f21bd26-kube-api-access-9gxhs\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.964526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-config-data\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.964566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.964588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.965210 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-kolla-config\") pod 
\"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.965892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-config-data\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.968306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-memcached-tls-certs\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.969585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-combined-ca-bundle\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:29 crc kubenswrapper[4764]: I1203 23:59:29.979361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gxhs\" (UniqueName: \"kubernetes.io/projected/662de035-d0f1-4a65-98ad-161d6f21bd26-kube-api-access-9gxhs\") pod \"memcached-0\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " pod="openstack/memcached-0" Dec 03 23:59:30 crc kubenswrapper[4764]: I1203 23:59:30.149468 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 23:59:31 crc kubenswrapper[4764]: W1203 23:59:31.189898 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76708e9b_1db4_42ca_94d2_7ff96d08d855.slice/crio-79b05697b0109953e848aa512e8920f7fd0c8e0fec85799274334d68e47f1f37 WatchSource:0}: Error finding container 79b05697b0109953e848aa512e8920f7fd0c8e0fec85799274334d68e47f1f37: Status 404 returned error can't find the container with id 79b05697b0109953e848aa512e8920f7fd0c8e0fec85799274334d68e47f1f37 Dec 03 23:59:31 crc kubenswrapper[4764]: W1203 23:59:31.194700 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda43f61_31ae_4c4c_967e_f0e8d13f5ae9.slice/crio-2fb8fe5fe4b5dd4640b6f9d111a943406eeb12c609f69dcc9c26a2e78dd0f3f3 WatchSource:0}: Error finding container 2fb8fe5fe4b5dd4640b6f9d111a943406eeb12c609f69dcc9c26a2e78dd0f3f3: Status 404 returned error can't find the container with id 2fb8fe5fe4b5dd4640b6f9d111a943406eeb12c609f69dcc9c26a2e78dd0f3f3 Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.259637 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.260695 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.262752 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mlcc9" Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.275796 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"76708e9b-1db4-42ca-94d2-7ff96d08d855","Type":"ContainerStarted","Data":"79b05697b0109953e848aa512e8920f7fd0c8e0fec85799274334d68e47f1f37"} Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.276978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9","Type":"ContainerStarted","Data":"2fb8fe5fe4b5dd4640b6f9d111a943406eeb12c609f69dcc9c26a2e78dd0f3f3"} Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.312409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.388452 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h57rf\" (UniqueName: \"kubernetes.io/projected/660b3b11-42db-456f-997b-250a9120afc9-kube-api-access-h57rf\") pod \"kube-state-metrics-0\" (UID: \"660b3b11-42db-456f-997b-250a9120afc9\") " pod="openstack/kube-state-metrics-0" Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.489825 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h57rf\" (UniqueName: \"kubernetes.io/projected/660b3b11-42db-456f-997b-250a9120afc9-kube-api-access-h57rf\") pod \"kube-state-metrics-0\" (UID: \"660b3b11-42db-456f-997b-250a9120afc9\") " pod="openstack/kube-state-metrics-0" Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.517005 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h57rf\" (UniqueName: 
\"kubernetes.io/projected/660b3b11-42db-456f-997b-250a9120afc9-kube-api-access-h57rf\") pod \"kube-state-metrics-0\" (UID: \"660b3b11-42db-456f-997b-250a9120afc9\") " pod="openstack/kube-state-metrics-0" Dec 03 23:59:31 crc kubenswrapper[4764]: I1203 23:59:31.580752 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.645955 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.652829 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.655857 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qmm4g" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.656158 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.656234 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.657465 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.666782 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.673203 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-combined-ca-bundle\") pod 
\"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782753 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782891 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4srwm\" (UniqueName: \"kubernetes.io/projected/fcec7a86-cb5c-49e9-af77-30958d09c359-kube-api-access-4srwm\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782972 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.782995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-config\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.783022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.871519 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l2vv9"] Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.872595 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.877578 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.877904 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9svq5" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.878009 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.882275 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l2vv9"] Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884794 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4srwm\" (UniqueName: \"kubernetes.io/projected/fcec7a86-cb5c-49e9-af77-30958d09c359-kube-api-access-4srwm\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-config\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884930 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.884970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.885640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.886062 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-xnsqq"] Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.886686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-config\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.885317 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.887414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.887632 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.899786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xnsqq"] Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.901448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.901952 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.912986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4srwm\" (UniqueName: \"kubernetes.io/projected/fcec7a86-cb5c-49e9-af77-30958d09c359-kube-api-access-4srwm\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.915458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.916331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"fcec7a86-cb5c-49e9-af77-30958d09c359\") " pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987280 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-lib\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f850034-7f6e-4811-b98f-89648c559dcd-scripts\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-run\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-log\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" 
Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2fbx\" (UniqueName: \"kubernetes.io/projected/ab264d6c-eecf-496f-b505-39b128dd8e44-kube-api-access-f2fbx\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987578 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-combined-ca-bundle\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab264d6c-eecf-496f-b505-39b128dd8e44-scripts\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run-ovn\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987771 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7tbw\" (UniqueName: \"kubernetes.io/projected/9f850034-7f6e-4811-b98f-89648c559dcd-kube-api-access-q7tbw\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:36 
crc kubenswrapper[4764]: I1203 23:59:36.987796 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-etc-ovs\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-log-ovn\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.987968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-ovn-controller-tls-certs\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:36 crc kubenswrapper[4764]: I1203 23:59:36.998993 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.088945 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-ovn-controller-tls-certs\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.088990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-lib\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089011 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f850034-7f6e-4811-b98f-89648c559dcd-scripts\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-run\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089062 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-log\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2fbx\" (UniqueName: \"kubernetes.io/projected/ab264d6c-eecf-496f-b505-39b128dd8e44-kube-api-access-f2fbx\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089107 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-combined-ca-bundle\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab264d6c-eecf-496f-b505-39b128dd8e44-scripts\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run-ovn\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7tbw\" (UniqueName: 
\"kubernetes.io/projected/9f850034-7f6e-4811-b98f-89648c559dcd-kube-api-access-q7tbw\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089199 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-etc-ovs\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-log-ovn\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-log-ovn\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run-ovn\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089929 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-run\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " 
pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.089976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-lib\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.090116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-log\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.090270 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-etc-ovs\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.090443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.092143 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f850034-7f6e-4811-b98f-89648c559dcd-scripts\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.096882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-ovn-controller-tls-certs\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.098116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab264d6c-eecf-496f-b505-39b128dd8e44-scripts\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.104403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-combined-ca-bundle\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.104638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2fbx\" (UniqueName: \"kubernetes.io/projected/ab264d6c-eecf-496f-b505-39b128dd8e44-kube-api-access-f2fbx\") pod \"ovn-controller-l2vv9\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.109097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7tbw\" (UniqueName: \"kubernetes.io/projected/9f850034-7f6e-4811-b98f-89648c559dcd-kube-api-access-q7tbw\") pod \"ovn-controller-ovs-xnsqq\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.253332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:37 crc kubenswrapper[4764]: I1203 23:59:37.259301 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.695104 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.696311 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.698056 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.702562 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c2hs5" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.702851 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.703609 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.706638 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.813484 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.813579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" 
Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.813607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.813666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.813694 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.813730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.814607 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-config\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.814655 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7jh7\" (UniqueName: \"kubernetes.io/projected/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-kube-api-access-x7jh7\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.915590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.915862 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.915892 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.915963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-config\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.916000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jh7\" (UniqueName: 
\"kubernetes.io/projected/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-kube-api-access-x7jh7\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.916053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.916072 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.916094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.916420 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.917397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc 
kubenswrapper[4764]: I1203 23:59:38.917611 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.917867 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-config\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.921595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.925147 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.925700 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.934809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jh7\" (UniqueName: 
\"kubernetes.io/projected/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-kube-api-access-x7jh7\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:38 crc kubenswrapper[4764]: I1203 23:59:38.940089 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:39 crc kubenswrapper[4764]: I1203 23:59:39.051023 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:40 crc kubenswrapper[4764]: E1203 23:59:40.484751 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 03 23:59:40 crc kubenswrapper[4764]: E1203 23:59:40.485208 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbmq6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5cd484bb89-h4kzr_openstack(c4311a0b-3468-40bc-8cb0-a2e86e611df3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 23:59:40 crc kubenswrapper[4764]: E1203 23:59:40.487043 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" podUID="c4311a0b-3468-40bc-8cb0-a2e86e611df3" Dec 03 23:59:40 crc kubenswrapper[4764]: E1203 23:59:40.501040 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792" Dec 03 23:59:40 crc kubenswrapper[4764]: E1203 23:59:40.501187 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:42f5663a161307156673f86e5eaad59f842a4bf25824f48008e69ab18e4ba792,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptmqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:n
il,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-567c455747-vlqp6_openstack(b723e904-fceb-4071-ac70-9e18f7b1ee24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 23:59:40 crc kubenswrapper[4764]: E1203 23:59:40.502440 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-567c455747-vlqp6" podUID="b723e904-fceb-4071-ac70-9e18f7b1ee24" Dec 03 23:59:40 crc kubenswrapper[4764]: I1203 23:59:40.946217 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 23:59:41 crc kubenswrapper[4764]: W1203 23:59:41.511992 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod662de035_d0f1_4a65_98ad_161d6f21bd26.slice/crio-91e20ac143ec056870b8d11cd6b57efb90705751500ca056d7496ec367fef543 WatchSource:0}: Error finding container 91e20ac143ec056870b8d11cd6b57efb90705751500ca056d7496ec367fef543: Status 404 returned error can't find the container with id 91e20ac143ec056870b8d11cd6b57efb90705751500ca056d7496ec367fef543 Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.731058 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.733433 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.871952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-dns-svc\") pod \"b723e904-fceb-4071-ac70-9e18f7b1ee24\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.872031 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptmqz\" (UniqueName: \"kubernetes.io/projected/b723e904-fceb-4071-ac70-9e18f7b1ee24-kube-api-access-ptmqz\") pod \"b723e904-fceb-4071-ac70-9e18f7b1ee24\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.872137 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4311a0b-3468-40bc-8cb0-a2e86e611df3-config\") pod \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.872193 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-config\") pod \"b723e904-fceb-4071-ac70-9e18f7b1ee24\" (UID: \"b723e904-fceb-4071-ac70-9e18f7b1ee24\") " Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.872221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbmq6\" (UniqueName: \"kubernetes.io/projected/c4311a0b-3468-40bc-8cb0-a2e86e611df3-kube-api-access-wbmq6\") pod 
\"c4311a0b-3468-40bc-8cb0-a2e86e611df3\" (UID: \"c4311a0b-3468-40bc-8cb0-a2e86e611df3\") " Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.872790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4311a0b-3468-40bc-8cb0-a2e86e611df3-config" (OuterVolumeSpecName: "config") pod "c4311a0b-3468-40bc-8cb0-a2e86e611df3" (UID: "c4311a0b-3468-40bc-8cb0-a2e86e611df3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.873155 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b723e904-fceb-4071-ac70-9e18f7b1ee24" (UID: "b723e904-fceb-4071-ac70-9e18f7b1ee24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.876294 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-config" (OuterVolumeSpecName: "config") pod "b723e904-fceb-4071-ac70-9e18f7b1ee24" (UID: "b723e904-fceb-4071-ac70-9e18f7b1ee24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.968024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4311a0b-3468-40bc-8cb0-a2e86e611df3-kube-api-access-wbmq6" (OuterVolumeSpecName: "kube-api-access-wbmq6") pod "c4311a0b-3468-40bc-8cb0-a2e86e611df3" (UID: "c4311a0b-3468-40bc-8cb0-a2e86e611df3"). InnerVolumeSpecName "kube-api-access-wbmq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.969801 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b723e904-fceb-4071-ac70-9e18f7b1ee24-kube-api-access-ptmqz" (OuterVolumeSpecName: "kube-api-access-ptmqz") pod "b723e904-fceb-4071-ac70-9e18f7b1ee24" (UID: "b723e904-fceb-4071-ac70-9e18f7b1ee24"). InnerVolumeSpecName "kube-api-access-ptmqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.973949 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.973971 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptmqz\" (UniqueName: \"kubernetes.io/projected/b723e904-fceb-4071-ac70-9e18f7b1ee24-kube-api-access-ptmqz\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.973983 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4311a0b-3468-40bc-8cb0-a2e86e611df3-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.973991 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b723e904-fceb-4071-ac70-9e18f7b1ee24-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:41 crc kubenswrapper[4764]: I1203 23:59:41.973999 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbmq6\" (UniqueName: \"kubernetes.io/projected/c4311a0b-3468-40bc-8cb0-a2e86e611df3-kube-api-access-wbmq6\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.084364 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 
03 23:59:42 crc kubenswrapper[4764]: W1203 23:59:42.098144 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8586564_9024_4375_a5f7_e75844abe723.slice/crio-eff8889916235921701c7a104e5d82097e60d31f001e1670e592a1447bf09fc2 WatchSource:0}: Error finding container eff8889916235921701c7a104e5d82097e60d31f001e1670e592a1447bf09fc2: Status 404 returned error can't find the container with id eff8889916235921701c7a104e5d82097e60d31f001e1670e592a1447bf09fc2 Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.204277 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 23:59:42 crc kubenswrapper[4764]: W1203 23:59:42.220017 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6160ab00_1691_41f8_9902_80d33e435770.slice/crio-62a1bed6b510cae08735450564fad85c90c569d930b22c9100c19e60d9ad3c00 WatchSource:0}: Error finding container 62a1bed6b510cae08735450564fad85c90c569d930b22c9100c19e60d9ad3c00: Status 404 returned error can't find the container with id 62a1bed6b510cae08735450564fad85c90c569d930b22c9100c19e60d9ad3c00 Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.303857 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.314637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.322682 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l2vv9"] Dec 03 23:59:42 crc kubenswrapper[4764]: W1203 23:59:42.368258 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod660b3b11_42db_456f_997b_250a9120afc9.slice/crio-d479e4bb68de2ef7ac39b558e4900b23137e82cb2b705d4b7bcc075dd672d07f WatchSource:0}: Error finding container d479e4bb68de2ef7ac39b558e4900b23137e82cb2b705d4b7bcc075dd672d07f: Status 404 returned error can't find the container with id d479e4bb68de2ef7ac39b558e4900b23137e82cb2b705d4b7bcc075dd672d07f Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.380234 4764 generic.go:334] "Generic (PLEG): container finished" podID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerID="971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998" exitCode=0 Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.380312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" event={"ID":"1378ad3a-99d0-47e1-8c66-74a67341e30d","Type":"ContainerDied","Data":"971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.383076 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"662de035-d0f1-4a65-98ad-161d6f21bd26","Type":"ContainerStarted","Data":"91e20ac143ec056870b8d11cd6b57efb90705751500ca056d7496ec367fef543"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.384104 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-vlqp6" Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.384164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-vlqp6" event={"ID":"b723e904-fceb-4071-ac70-9e18f7b1ee24","Type":"ContainerDied","Data":"3806e0a366e7d5530c02d1214b5a964e6ef4bd42b9a0c02048220b01de3e84a6"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.385377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6160ab00-1691-41f8-9902-80d33e435770","Type":"ContainerStarted","Data":"62a1bed6b510cae08735450564fad85c90c569d930b22c9100c19e60d9ad3c00"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.386594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" event={"ID":"c4311a0b-3468-40bc-8cb0-a2e86e611df3","Type":"ContainerDied","Data":"2e56e6bb8b3c8c1546dd2846b657495b5b5cadd8f0a38cc9a47b6967b05f325f"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.386631 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-h4kzr" Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.387948 4764 generic.go:334] "Generic (PLEG): container finished" podID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerID="f87a5fed88238b272c643b2facc8e8828c94a2f1a564b73e85593e80bb39a8fb" exitCode=0 Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.388050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" event={"ID":"78ea605b-ce13-4473-aa8c-b0358c0bc35a","Type":"ContainerDied","Data":"f87a5fed88238b272c643b2facc8e8828c94a2f1a564b73e85593e80bb39a8fb"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.388757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d8586564-9024-4375-a5f7-e75844abe723","Type":"ContainerStarted","Data":"eff8889916235921701c7a104e5d82097e60d31f001e1670e592a1447bf09fc2"} Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.467382 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-vlqp6"] Dec 03 23:59:42 crc kubenswrapper[4764]: W1203 23:59:42.467985 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod249c3a6b_9345_49ed_9b2d_a0991fb02dc0.slice/crio-115299cd95d9c0ccbefe3eb31bf8c5ba467cfd3f82d8ea997f1fed3d53c0c123 WatchSource:0}: Error finding container 115299cd95d9c0ccbefe3eb31bf8c5ba467cfd3f82d8ea997f1fed3d53c0c123: Status 404 returned error can't find the container with id 115299cd95d9c0ccbefe3eb31bf8c5ba467cfd3f82d8ea997f1fed3d53c0c123 Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.476634 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-vlqp6"] Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.500376 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-h4kzr"] Dec 03 23:59:42 crc 
kubenswrapper[4764]: I1203 23:59:42.505650 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-h4kzr"] Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.555325 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b723e904-fceb-4071-ac70-9e18f7b1ee24" path="/var/lib/kubelet/pods/b723e904-fceb-4071-ac70-9e18f7b1ee24/volumes" Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.555901 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4311a0b-3468-40bc-8cb0-a2e86e611df3" path="/var/lib/kubelet/pods/c4311a0b-3468-40bc-8cb0-a2e86e611df3/volumes" Dec 03 23:59:42 crc kubenswrapper[4764]: W1203 23:59:42.666753 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab264d6c_eecf_496f_b505_39b128dd8e44.slice/crio-e34ea702b8e802e242113cae6f79d16871096f4dc8404bf566245c3eb62369dc WatchSource:0}: Error finding container e34ea702b8e802e242113cae6f79d16871096f4dc8404bf566245c3eb62369dc: Status 404 returned error can't find the container with id e34ea702b8e802e242113cae6f79d16871096f4dc8404bf566245c3eb62369dc Dec 03 23:59:42 crc kubenswrapper[4764]: I1203 23:59:42.949803 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-xnsqq"] Dec 03 23:59:43 crc kubenswrapper[4764]: I1203 23:59:43.062990 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 23:59:43 crc kubenswrapper[4764]: I1203 23:59:43.399377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l2vv9" event={"ID":"ab264d6c-eecf-496f-b505-39b128dd8e44","Type":"ContainerStarted","Data":"e34ea702b8e802e242113cae6f79d16871096f4dc8404bf566245c3eb62369dc"} Dec 03 23:59:43 crc kubenswrapper[4764]: I1203 23:59:43.401138 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9","Type":"ContainerStarted","Data":"6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11"} Dec 03 23:59:43 crc kubenswrapper[4764]: I1203 23:59:43.403219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"76708e9b-1db4-42ca-94d2-7ff96d08d855","Type":"ContainerStarted","Data":"309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0"} Dec 03 23:59:43 crc kubenswrapper[4764]: I1203 23:59:43.406023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"660b3b11-42db-456f-997b-250a9120afc9","Type":"ContainerStarted","Data":"d479e4bb68de2ef7ac39b558e4900b23137e82cb2b705d4b7bcc075dd672d07f"} Dec 03 23:59:43 crc kubenswrapper[4764]: I1203 23:59:43.407154 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249c3a6b-9345-49ed-9b2d-a0991fb02dc0","Type":"ContainerStarted","Data":"115299cd95d9c0ccbefe3eb31bf8c5ba467cfd3f82d8ea997f1fed3d53c0c123"} Dec 03 23:59:43 crc kubenswrapper[4764]: W1203 23:59:43.726459 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f850034_7f6e_4811_b98f_89648c559dcd.slice/crio-de92fdc23ff22f26de018d9ed7d3f9f3c88305ce541b66ece2c46ec79fd069f5 WatchSource:0}: Error finding container de92fdc23ff22f26de018d9ed7d3f9f3c88305ce541b66ece2c46ec79fd069f5: Status 404 returned error can't find the container with id de92fdc23ff22f26de018d9ed7d3f9f3c88305ce541b66ece2c46ec79fd069f5 Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.415956 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fcec7a86-cb5c-49e9-af77-30958d09c359","Type":"ContainerStarted","Data":"0bb1361cc63c0a90e857adf49e76d7b94911883caf7741f4d64321925645a31c"} Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.417889 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" event={"ID":"78ea605b-ce13-4473-aa8c-b0358c0bc35a","Type":"ContainerStarted","Data":"bf4c459a477fedcbb2f3c6acbfb4d419534bff3bd0ea77642f763c7c1cd0032c"} Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.418043 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.419693 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" event={"ID":"1378ad3a-99d0-47e1-8c66-74a67341e30d","Type":"ContainerStarted","Data":"93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119"} Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.419837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.421367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"662de035-d0f1-4a65-98ad-161d6f21bd26","Type":"ContainerStarted","Data":"ac966fb4e19027f88b3b69616fbd7358921c916973249c52a6951d8a77e62d9f"} Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.421507 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.422860 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xnsqq" event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerStarted","Data":"de92fdc23ff22f26de018d9ed7d3f9f3c88305ce541b66ece2c46ec79fd069f5"} Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.465445 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" podStartSLOduration=4.121782636 podStartE2EDuration="19.465427048s" podCreationTimestamp="2025-12-03 23:59:25 +0000 UTC" firstStartedPulling="2025-12-03 
23:59:26.248827522 +0000 UTC m=+1102.010151933" lastFinishedPulling="2025-12-03 23:59:41.592471934 +0000 UTC m=+1117.353796345" observedRunningTime="2025-12-03 23:59:44.439937592 +0000 UTC m=+1120.201262023" watchObservedRunningTime="2025-12-03 23:59:44.465427048 +0000 UTC m=+1120.226751449" Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.465767 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.218761945 podStartE2EDuration="15.465762776s" podCreationTimestamp="2025-12-03 23:59:29 +0000 UTC" firstStartedPulling="2025-12-03 23:59:41.552670748 +0000 UTC m=+1117.313995159" lastFinishedPulling="2025-12-03 23:59:43.799671579 +0000 UTC m=+1119.560995990" observedRunningTime="2025-12-03 23:59:44.457757779 +0000 UTC m=+1120.219082200" watchObservedRunningTime="2025-12-03 23:59:44.465762776 +0000 UTC m=+1120.227087187" Dec 03 23:59:44 crc kubenswrapper[4764]: I1203 23:59:44.476438 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" podStartSLOduration=4.304605973 podStartE2EDuration="19.476421157s" podCreationTimestamp="2025-12-03 23:59:25 +0000 UTC" firstStartedPulling="2025-12-03 23:59:26.501650537 +0000 UTC m=+1102.262974948" lastFinishedPulling="2025-12-03 23:59:41.673465721 +0000 UTC m=+1117.434790132" observedRunningTime="2025-12-03 23:59:44.475002372 +0000 UTC m=+1120.236326783" watchObservedRunningTime="2025-12-03 23:59:44.476421157 +0000 UTC m=+1120.237745568" Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.465421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"660b3b11-42db-456f-997b-250a9120afc9","Type":"ContainerStarted","Data":"4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae"} Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.465983 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.468365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249c3a6b-9345-49ed-9b2d-a0991fb02dc0","Type":"ContainerStarted","Data":"5ff0a10729c25996e23b4567c55f3aef2507de83dbe93682e1eebbaeef977a9f"} Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.471146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l2vv9" event={"ID":"ab264d6c-eecf-496f-b505-39b128dd8e44","Type":"ContainerStarted","Data":"f95f4c5c24d79cd3be5e889124f00a07909163b761e1d21f5535b51adffcdebb"} Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.471268 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-l2vv9" Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.474638 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6160ab00-1691-41f8-9902-80d33e435770","Type":"ContainerStarted","Data":"ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d"} Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.476608 4764 generic.go:334] "Generic (PLEG): container finished" podID="9f850034-7f6e-4811-b98f-89648c559dcd" containerID="2d73d87901dced518fbff8a4656eb6e2c9223d98c28022fedebac00cf11bf4dc" exitCode=0 Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.476662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xnsqq" event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerDied","Data":"2d73d87901dced518fbff8a4656eb6e2c9223d98c28022fedebac00cf11bf4dc"} Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.479329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fcec7a86-cb5c-49e9-af77-30958d09c359","Type":"ContainerStarted","Data":"848bf827a01a5b8980ad8035d235310421f22cc7cf5bb62a1d3f4398c2863351"} Dec 
03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.480996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d8586564-9024-4375-a5f7-e75844abe723","Type":"ContainerStarted","Data":"eaaabf3d0d46d02c83dcc8d6773188d2b52ac585608db6c2550768e3791de973"} Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.491798 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.074044372 podStartE2EDuration="18.491772784s" podCreationTimestamp="2025-12-03 23:59:31 +0000 UTC" firstStartedPulling="2025-12-03 23:59:42.376880813 +0000 UTC m=+1118.138205244" lastFinishedPulling="2025-12-03 23:59:48.794609225 +0000 UTC m=+1124.555933656" observedRunningTime="2025-12-03 23:59:49.489466157 +0000 UTC m=+1125.250790568" watchObservedRunningTime="2025-12-03 23:59:49.491772784 +0000 UTC m=+1125.253097205" Dec 03 23:59:49 crc kubenswrapper[4764]: I1203 23:59:49.531276 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-l2vv9" podStartSLOduration=7.430444103 podStartE2EDuration="13.531262392s" podCreationTimestamp="2025-12-03 23:59:36 +0000 UTC" firstStartedPulling="2025-12-03 23:59:42.673647682 +0000 UTC m=+1118.434972093" lastFinishedPulling="2025-12-03 23:59:48.774465981 +0000 UTC m=+1124.535790382" observedRunningTime="2025-12-03 23:59:49.53117786 +0000 UTC m=+1125.292502281" watchObservedRunningTime="2025-12-03 23:59:49.531262392 +0000 UTC m=+1125.292586803" Dec 03 23:59:50 crc kubenswrapper[4764]: I1203 23:59:50.150609 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 23:59:50 crc kubenswrapper[4764]: I1203 23:59:50.490752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xnsqq" 
event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerStarted","Data":"2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1"} Dec 03 23:59:50 crc kubenswrapper[4764]: I1203 23:59:50.491086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xnsqq" event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerStarted","Data":"72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d"} Dec 03 23:59:50 crc kubenswrapper[4764]: I1203 23:59:50.524466 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-xnsqq" podStartSLOduration=9.552346456 podStartE2EDuration="14.524437001s" podCreationTimestamp="2025-12-03 23:59:36 +0000 UTC" firstStartedPulling="2025-12-03 23:59:43.748161886 +0000 UTC m=+1119.509486307" lastFinishedPulling="2025-12-03 23:59:48.720252441 +0000 UTC m=+1124.481576852" observedRunningTime="2025-12-03 23:59:50.517676675 +0000 UTC m=+1126.279001096" watchObservedRunningTime="2025-12-03 23:59:50.524437001 +0000 UTC m=+1126.285761432" Dec 03 23:59:50 crc kubenswrapper[4764]: I1203 23:59:50.715872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.051971 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.119094 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-r9m9v"] Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.496949 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerName="dnsmasq-dns" containerID="cri-o://bf4c459a477fedcbb2f3c6acbfb4d419534bff3bd0ea77642f763c7c1cd0032c" gracePeriod=10 Dec 03 23:59:51 crc 
kubenswrapper[4764]: I1203 23:59:51.498067 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.498321 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-xnsqq" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.510376 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-dmd7s"] Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.511552 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.526024 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-dmd7s"] Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.563846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-config\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.563936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-dns-svc\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.563960 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phx4g\" (UniqueName: \"kubernetes.io/projected/b3d79ab9-f767-4aa3-b30a-266b311e436a-kube-api-access-phx4g\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " 
pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.666739 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-config\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.667809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-config\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.668151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-dns-svc\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.666867 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-dns-svc\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.668618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phx4g\" (UniqueName: \"kubernetes.io/projected/b3d79ab9-f767-4aa3-b30a-266b311e436a-kube-api-access-phx4g\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.690749 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phx4g\" (UniqueName: \"kubernetes.io/projected/b3d79ab9-f767-4aa3-b30a-266b311e436a-kube-api-access-phx4g\") pod \"dnsmasq-dns-66c567d66c-dmd7s\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:51 crc kubenswrapper[4764]: I1203 23:59:51.837192 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.504363 4764 generic.go:334] "Generic (PLEG): container finished" podID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerID="bf4c459a477fedcbb2f3c6acbfb4d419534bff3bd0ea77642f763c7c1cd0032c" exitCode=0 Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.504495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" event={"ID":"78ea605b-ce13-4473-aa8c-b0358c0bc35a","Type":"ContainerDied","Data":"bf4c459a477fedcbb2f3c6acbfb4d419534bff3bd0ea77642f763c7c1cd0032c"} Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.643459 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.648180 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.652132 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.652198 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.652389 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.652429 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xnt5n" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.678648 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.786767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.786825 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.786873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-lock\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc 
kubenswrapper[4764]: I1203 23:59:52.786898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w2hg\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-kube-api-access-9w2hg\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.787052 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-cache\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.888229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-cache\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.888325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.888352 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.888366 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: 
\"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-lock\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.888381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w2hg\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-kube-api-access-9w2hg\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: E1203 23:59:52.888924 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 23:59:52 crc kubenswrapper[4764]: E1203 23:59:52.888943 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 23:59:52 crc kubenswrapper[4764]: E1203 23:59:52.888978 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift podName:1691fb5b-c57a-4773-9710-347c99bd9712 nodeName:}" failed. No retries permitted until 2025-12-03 23:59:53.388964183 +0000 UTC m=+1129.150288594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift") pod "swift-storage-0" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712") : configmap "swift-ring-files" not found Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.889120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-cache\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.889343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-lock\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.889373 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.911577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w2hg\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-kube-api-access-9w2hg\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:52 crc kubenswrapper[4764]: I1203 23:59:52.952891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " 
pod="openstack/swift-storage-0" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.255630 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-8gb4t"] Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.259403 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.261532 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.264188 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.264390 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299135 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-swiftconf\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299206 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-ring-data-devices\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299237 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8crrj\" (UniqueName: \"kubernetes.io/projected/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-kube-api-access-8crrj\") pod 
\"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-combined-ca-bundle\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-dispersionconf\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-etc-swift\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.299396 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-scripts\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.310454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8gb4t"] Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.311441 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400158 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4v4x\" (UniqueName: \"kubernetes.io/projected/78ea605b-ce13-4473-aa8c-b0358c0bc35a-kube-api-access-l4v4x\") pod \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400237 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-dns-svc\") pod \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-config\") pod \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\" (UID: \"78ea605b-ce13-4473-aa8c-b0358c0bc35a\") " Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400641 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-dispersionconf\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-etc-swift\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400711 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-scripts\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400774 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-swiftconf\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400819 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-ring-data-devices\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8crrj\" (UniqueName: \"kubernetes.io/projected/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-kube-api-access-8crrj\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400860 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-combined-ca-bundle\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.400892 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:53 crc kubenswrapper[4764]: E1203 23:59:53.401010 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 23:59:53 crc kubenswrapper[4764]: E1203 23:59:53.401024 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 23:59:53 crc kubenswrapper[4764]: E1203 23:59:53.401063 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift podName:1691fb5b-c57a-4773-9710-347c99bd9712 nodeName:}" failed. No retries permitted until 2025-12-03 23:59:54.401049983 +0000 UTC m=+1130.162374394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift") pod "swift-storage-0" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712") : configmap "swift-ring-files" not found Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.401145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-etc-swift\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.401782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-scripts\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.401786 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-ring-data-devices\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.404752 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-combined-ca-bundle\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.404997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78ea605b-ce13-4473-aa8c-b0358c0bc35a-kube-api-access-l4v4x" (OuterVolumeSpecName: "kube-api-access-l4v4x") pod "78ea605b-ce13-4473-aa8c-b0358c0bc35a" (UID: "78ea605b-ce13-4473-aa8c-b0358c0bc35a"). InnerVolumeSpecName "kube-api-access-l4v4x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.405512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-swiftconf\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.406192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-dispersionconf\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.421450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8crrj\" (UniqueName: \"kubernetes.io/projected/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-kube-api-access-8crrj\") pod \"swift-ring-rebalance-8gb4t\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.487227 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78ea605b-ce13-4473-aa8c-b0358c0bc35a" (UID: "78ea605b-ce13-4473-aa8c-b0358c0bc35a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.491969 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-config" (OuterVolumeSpecName: "config") pod "78ea605b-ce13-4473-aa8c-b0358c0bc35a" (UID: "78ea605b-ce13-4473-aa8c-b0358c0bc35a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.502244 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-config\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.502273 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4v4x\" (UniqueName: \"kubernetes.io/projected/78ea605b-ce13-4473-aa8c-b0358c0bc35a-kube-api-access-l4v4x\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.502284 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78ea605b-ce13-4473-aa8c-b0358c0bc35a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.514770 4764 generic.go:334] "Generic (PLEG): container finished" podID="6160ab00-1691-41f8-9902-80d33e435770" containerID="ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d" exitCode=0 Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.514862 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6160ab00-1691-41f8-9902-80d33e435770","Type":"ContainerDied","Data":"ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d"} Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.520571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fcec7a86-cb5c-49e9-af77-30958d09c359","Type":"ContainerStarted","Data":"92cc9452c05ea4b28722f96dbdaea7af0d12e4f63960239309ced2df5f5e288d"} Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.525457 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.525462 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-r9m9v" event={"ID":"78ea605b-ce13-4473-aa8c-b0358c0bc35a","Type":"ContainerDied","Data":"5f8f5a5caebbb5475a05671fbc55a0b9cc4a3a83feffbd4331f4218aa785f6c0"} Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.525524 4764 scope.go:117] "RemoveContainer" containerID="bf4c459a477fedcbb2f3c6acbfb4d419534bff3bd0ea77642f763c7c1cd0032c" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.527220 4764 generic.go:334] "Generic (PLEG): container finished" podID="d8586564-9024-4375-a5f7-e75844abe723" containerID="eaaabf3d0d46d02c83dcc8d6773188d2b52ac585608db6c2550768e3791de973" exitCode=0 Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.527271 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d8586564-9024-4375-a5f7-e75844abe723","Type":"ContainerDied","Data":"eaaabf3d0d46d02c83dcc8d6773188d2b52ac585608db6c2550768e3791de973"} Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.530555 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249c3a6b-9345-49ed-9b2d-a0991fb02dc0","Type":"ContainerStarted","Data":"daaf88f8c3b80eeebebe4394604c8c9052727fcd26fe8a3c91f09babde9cc83e"} Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.550163 4764 scope.go:117] "RemoveContainer" containerID="f87a5fed88238b272c643b2facc8e8828c94a2f1a564b73e85593e80bb39a8fb" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.563002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.110504682 podStartE2EDuration="18.562980664s" podCreationTimestamp="2025-12-03 23:59:35 +0000 UTC" firstStartedPulling="2025-12-03 23:59:43.774886471 +0000 UTC m=+1119.536210882" 
lastFinishedPulling="2025-12-03 23:59:53.227362453 +0000 UTC m=+1128.988686864" observedRunningTime="2025-12-03 23:59:53.556990028 +0000 UTC m=+1129.318314439" watchObservedRunningTime="2025-12-03 23:59:53.562980664 +0000 UTC m=+1129.324305075" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.587175 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.808169711 podStartE2EDuration="16.587156717s" podCreationTimestamp="2025-12-03 23:59:37 +0000 UTC" firstStartedPulling="2025-12-03 23:59:42.471921864 +0000 UTC m=+1118.233246275" lastFinishedPulling="2025-12-03 23:59:53.25090887 +0000 UTC m=+1129.012233281" observedRunningTime="2025-12-03 23:59:53.576285691 +0000 UTC m=+1129.337610102" watchObservedRunningTime="2025-12-03 23:59:53.587156717 +0000 UTC m=+1129.348481138" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.626444 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-r9m9v"] Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.630655 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8gb4t" Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.632876 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-r9m9v"] Dec 03 23:59:53 crc kubenswrapper[4764]: I1203 23:59:53.658351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-dmd7s"] Dec 03 23:59:53 crc kubenswrapper[4764]: W1203 23:59:53.681048 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3d79ab9_f767_4aa3_b30a_266b311e436a.slice/crio-ba999ba055f3758330d994465b86275c36ff8613ae5ef91adcacc324b5f4e778 WatchSource:0}: Error finding container ba999ba055f3758330d994465b86275c36ff8613ae5ef91adcacc324b5f4e778: Status 404 returned error can't find the container with id ba999ba055f3758330d994465b86275c36ff8613ae5ef91adcacc324b5f4e778 Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.052011 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.052386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:54 crc kubenswrapper[4764]: W1203 23:59:54.053513 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode19a66e3_3521_4e8d_b4ff_6a9ef5003a8a.slice/crio-135701ae63a732cdf6faa798f8fa43f54a5fff8b8574558211470d9a951fdec0 WatchSource:0}: Error finding container 135701ae63a732cdf6faa798f8fa43f54a5fff8b8574558211470d9a951fdec0: Status 404 returned error can't find the container with id 135701ae63a732cdf6faa798f8fa43f54a5fff8b8574558211470d9a951fdec0 Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.054413 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-8gb4t"] Dec 03 23:59:54 crc 
kubenswrapper[4764]: I1203 23:59:54.092433 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.418213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:54 crc kubenswrapper[4764]: E1203 23:59:54.418703 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 23:59:54 crc kubenswrapper[4764]: E1203 23:59:54.418823 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 23:59:54 crc kubenswrapper[4764]: E1203 23:59:54.418925 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift podName:1691fb5b-c57a-4773-9710-347c99bd9712 nodeName:}" failed. No retries permitted until 2025-12-03 23:59:56.418893307 +0000 UTC m=+1132.180217768 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift") pod "swift-storage-0" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712") : configmap "swift-ring-files" not found Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.543194 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gb4t" event={"ID":"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a","Type":"ContainerStarted","Data":"135701ae63a732cdf6faa798f8fa43f54a5fff8b8574558211470d9a951fdec0"} Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.550824 4764 generic.go:334] "Generic (PLEG): container finished" podID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerID="a617a91c9e3194bb193c85c5995f063860249a71530afc25099c1200bd38ee3a" exitCode=0 Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.570388 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" path="/var/lib/kubelet/pods/78ea605b-ce13-4473-aa8c-b0358c0bc35a/volumes" Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.572359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d8586564-9024-4375-a5f7-e75844abe723","Type":"ContainerStarted","Data":"40932f1f044feae057b1145cd8eb76e3370493aa44a7c5f0f8b568439dbde7ab"} Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.572437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" event={"ID":"b3d79ab9-f767-4aa3-b30a-266b311e436a","Type":"ContainerDied","Data":"a617a91c9e3194bb193c85c5995f063860249a71530afc25099c1200bd38ee3a"} Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.572479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" 
event={"ID":"b3d79ab9-f767-4aa3-b30a-266b311e436a","Type":"ContainerStarted","Data":"ba999ba055f3758330d994465b86275c36ff8613ae5ef91adcacc324b5f4e778"} Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.572505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6160ab00-1691-41f8-9902-80d33e435770","Type":"ContainerStarted","Data":"6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4"} Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.638762 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.642604 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=22.573898381 podStartE2EDuration="28.642581753s" podCreationTimestamp="2025-12-03 23:59:26 +0000 UTC" firstStartedPulling="2025-12-03 23:59:42.100280399 +0000 UTC m=+1117.861604810" lastFinishedPulling="2025-12-03 23:59:48.168963781 +0000 UTC m=+1123.930288182" observedRunningTime="2025-12-03 23:59:54.639068947 +0000 UTC m=+1130.400393368" watchObservedRunningTime="2025-12-03 23:59:54.642581753 +0000 UTC m=+1130.403906174" Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.702427 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.600657459 podStartE2EDuration="26.702409051s" podCreationTimestamp="2025-12-03 23:59:28 +0000 UTC" firstStartedPulling="2025-12-03 23:59:42.225256144 +0000 UTC m=+1117.986580575" lastFinishedPulling="2025-12-03 23:59:48.327007756 +0000 UTC m=+1124.088332167" observedRunningTime="2025-12-03 23:59:54.696862874 +0000 UTC m=+1130.458187275" watchObservedRunningTime="2025-12-03 23:59:54.702409051 +0000 UTC m=+1130.463733462" Dec 03 23:59:54 crc kubenswrapper[4764]: I1203 23:59:54.999401 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.018056 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-dmd7s"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.076710 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-555449b67f-kdvzw"] Dec 03 23:59:55 crc kubenswrapper[4764]: E1203 23:59:55.077210 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerName="dnsmasq-dns" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.077276 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerName="dnsmasq-dns" Dec 03 23:59:55 crc kubenswrapper[4764]: E1203 23:59:55.077336 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerName="init" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.077387 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerName="init" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.077620 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="78ea605b-ce13-4473-aa8c-b0358c0bc35a" containerName="dnsmasq-dns" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.078465 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.082262 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.132652 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.138931 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-555449b67f-kdvzw"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.238438 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-config\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.238485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/28e9b357-81d0-4562-8796-f431986ea655-kube-api-access-njj95\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.238515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-ovsdbserver-sb\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.238593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-dns-svc\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.297628 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8zvg9"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.298629 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.306061 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.320019 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8zvg9"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.340155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/28e9b357-81d0-4562-8796-f431986ea655-kube-api-access-njj95\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.340202 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-ovsdbserver-sb\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.340283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-dns-svc\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: 
\"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.340341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-config\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.341087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-config\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.341661 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-ovsdbserver-sb\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.341674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-dns-svc\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.358519 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/28e9b357-81d0-4562-8796-f431986ea655-kube-api-access-njj95\") pod \"dnsmasq-dns-555449b67f-kdvzw\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc 
kubenswrapper[4764]: I1203 23:59:55.394845 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.441616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovs-rundir\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.441657 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-config\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.441705 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-combined-ca-bundle\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.441760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.441781 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mgj\" (UniqueName: 
\"kubernetes.io/projected/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-kube-api-access-p7mgj\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.441828 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovn-rundir\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543413 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovs-rundir\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-config\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-combined-ca-bundle\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543819 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543854 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7mgj\" (UniqueName: \"kubernetes.io/projected/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-kube-api-access-p7mgj\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543895 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovs-rundir\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovn-rundir\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.543999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovn-rundir\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.544395 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-config\") pod 
\"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.547842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.552553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-combined-ca-bundle\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.566439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7mgj\" (UniqueName: \"kubernetes.io/projected/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-kube-api-access-p7mgj\") pod \"ovn-controller-metrics-8zvg9\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.570060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" event={"ID":"b3d79ab9-f767-4aa3-b30a-266b311e436a","Type":"ContainerStarted","Data":"2587697e13c5936647c92355e876fd171d3d445c9413414ef2cd41a2a84d46fd"} Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.570787 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.570809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 03 23:59:55 crc 
kubenswrapper[4764]: I1203 23:59:55.611153 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-555449b67f-kdvzw"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.615390 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8zvg9" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.623280 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" podStartSLOduration=4.623257715 podStartE2EDuration="4.623257715s" podCreationTimestamp="2025-12-03 23:59:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 23:59:55.590885081 +0000 UTC m=+1131.352209492" watchObservedRunningTime="2025-12-03 23:59:55.623257715 +0000 UTC m=+1131.384582126" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.669404 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-784d65c867-dnn2v"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.670730 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.674328 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.682261 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-dnn2v"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.709977 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.749673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbkq\" (UniqueName: \"kubernetes.io/projected/5ec747d4-3aa0-4b3c-bb84-13776b506793-kube-api-access-llbkq\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.749703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.749793 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-dns-svc\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.749839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.749957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-config\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.851007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-config\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.851369 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llbkq\" (UniqueName: \"kubernetes.io/projected/5ec747d4-3aa0-4b3c-bb84-13776b506793-kube-api-access-llbkq\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.851396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.851450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-dns-svc\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.851512 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.851857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-config\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.852418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-sb\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.857478 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-dns-svc\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.857624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-nb\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: 
\"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.876699 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbkq\" (UniqueName: \"kubernetes.io/projected/5ec747d4-3aa0-4b3c-bb84-13776b506793-kube-api-access-llbkq\") pod \"dnsmasq-dns-784d65c867-dnn2v\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.906914 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.908482 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.910970 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.914282 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.914424 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.914534 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qzthq" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.924174 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.940122 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-555449b67f-kdvzw"] Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.952564 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-config\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.952624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-scripts\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.952659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.952730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.952771 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw9x2\" (UniqueName: \"kubernetes.io/projected/f105a7d8-bb79-4578-98fd-aca60d5ffa10-kube-api-access-pw9x2\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.952808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:55 crc kubenswrapper[4764]: I1203 23:59:55.953028 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.005975 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055334 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-config\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-scripts\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055468 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055513 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw9x2\" (UniqueName: \"kubernetes.io/projected/f105a7d8-bb79-4578-98fd-aca60d5ffa10-kube-api-access-pw9x2\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.055605 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.057625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-config\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.059035 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-scripts\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.059926 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.060057 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.060633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.064272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.076765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw9x2\" (UniqueName: \"kubernetes.io/projected/f105a7d8-bb79-4578-98fd-aca60d5ffa10-kube-api-access-pw9x2\") pod \"ovn-northd-0\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.218228 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8zvg9"] Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.241059 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.461527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 03 23:59:56 crc kubenswrapper[4764]: E1203 23:59:56.461755 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 23:59:56 crc kubenswrapper[4764]: E1203 23:59:56.461784 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 23:59:56 crc kubenswrapper[4764]: E1203 23:59:56.461872 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift podName:1691fb5b-c57a-4773-9710-347c99bd9712 nodeName:}" failed. No retries permitted until 2025-12-04 00:00:00.461852432 +0000 UTC m=+1136.223176853 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift") pod "swift-storage-0" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712") : configmap "swift-ring-files" not found Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.529079 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-dnn2v"] Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.592252 4764 generic.go:334] "Generic (PLEG): container finished" podID="28e9b357-81d0-4562-8796-f431986ea655" containerID="6f06074823e5c7bbd044c5cc25538a5709108a2ce7c2d76232f4d8c4dc05e57d" exitCode=0 Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.592736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" event={"ID":"28e9b357-81d0-4562-8796-f431986ea655","Type":"ContainerDied","Data":"6f06074823e5c7bbd044c5cc25538a5709108a2ce7c2d76232f4d8c4dc05e57d"} Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.592779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" event={"ID":"28e9b357-81d0-4562-8796-f431986ea655","Type":"ContainerStarted","Data":"e5df4030a6c11bf3b6c197a88b8190230f200694784947433a7b818e0df701ec"} Dec 03 23:59:56 crc kubenswrapper[4764]: I1203 23:59:56.593088 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="dnsmasq-dns" containerID="cri-o://2587697e13c5936647c92355e876fd171d3d445c9413414ef2cd41a2a84d46fd" gracePeriod=10 Dec 03 23:59:57 crc kubenswrapper[4764]: I1203 23:59:57.601210 4764 generic.go:334] "Generic (PLEG): container finished" podID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerID="2587697e13c5936647c92355e876fd171d3d445c9413414ef2cd41a2a84d46fd" exitCode=0 Dec 03 23:59:57 crc kubenswrapper[4764]: I1203 23:59:57.601297 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" event={"ID":"b3d79ab9-f767-4aa3-b30a-266b311e436a","Type":"ContainerDied","Data":"2587697e13c5936647c92355e876fd171d3d445c9413414ef2cd41a2a84d46fd"} Dec 03 23:59:58 crc kubenswrapper[4764]: I1203 23:59:58.334268 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 23:59:58 crc kubenswrapper[4764]: I1203 23:59:58.335353 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 23:59:59 crc kubenswrapper[4764]: I1203 23:59:59.817319 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 23:59:59 crc kubenswrapper[4764]: I1203 23:59:59.817691 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.157437 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9"] Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.165167 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.176729 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29413440-d789v"] Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.178068 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.183860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9"] Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.190232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29413440-d789v"] Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.192807 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.193360 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.193539 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.193699 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.308391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbjgn\" (UniqueName: \"kubernetes.io/projected/cc6683d2-f59f-42d2-8666-8960bce251af-kube-api-access-qbjgn\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.308919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6683d2-f59f-42d2-8666-8960bce251af-secret-volume\") pod \"collect-profiles-29413440-cjgm9\" (UID: 
\"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.309246 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbp4\" (UniqueName: \"kubernetes.io/projected/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-kube-api-access-ckbp4\") pod \"image-pruner-29413440-d789v\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.309508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6683d2-f59f-42d2-8666-8960bce251af-config-volume\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.309809 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-serviceca\") pod \"image-pruner-29413440-d789v\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.411530 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-serviceca\") pod \"image-pruner-29413440-d789v\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.411597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbjgn\" (UniqueName: 
\"kubernetes.io/projected/cc6683d2-f59f-42d2-8666-8960bce251af-kube-api-access-qbjgn\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.411667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6683d2-f59f-42d2-8666-8960bce251af-secret-volume\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.411690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbp4\" (UniqueName: \"kubernetes.io/projected/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-kube-api-access-ckbp4\") pod \"image-pruner-29413440-d789v\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.411736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6683d2-f59f-42d2-8666-8960bce251af-config-volume\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.412551 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6683d2-f59f-42d2-8666-8960bce251af-config-volume\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.413375 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-serviceca\") pod \"image-pruner-29413440-d789v\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.443931 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbp4\" (UniqueName: \"kubernetes.io/projected/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-kube-api-access-ckbp4\") pod \"image-pruner-29413440-d789v\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.454477 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6683d2-f59f-42d2-8666-8960bce251af-secret-volume\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.472363 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbjgn\" (UniqueName: \"kubernetes.io/projected/cc6683d2-f59f-42d2-8666-8960bce251af-kube-api-access-qbjgn\") pod \"collect-profiles-29413440-cjgm9\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.517936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 04 00:00:00 crc kubenswrapper[4764]: E1204 00:00:00.518110 4764 projected.go:288] 
Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 00:00:00 crc kubenswrapper[4764]: E1204 00:00:00.518125 4764 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 00:00:00 crc kubenswrapper[4764]: E1204 00:00:00.518166 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift podName:1691fb5b-c57a-4773-9710-347c99bd9712 nodeName:}" failed. No retries permitted until 2025-12-04 00:00:08.518153478 +0000 UTC m=+1144.279477889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift") pod "swift-storage-0" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712") : configmap "swift-ring-files" not found Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.532172 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:00 crc kubenswrapper[4764]: I1204 00:00:00.533342 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:01 crc kubenswrapper[4764]: I1204 00:00:01.613149 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 00:00:04 crc kubenswrapper[4764]: W1204 00:00:04.769033 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec747d4_3aa0_4b3c_bb84_13776b506793.slice/crio-54396dbbc47e4ef05fe03261495a4457af6905f89ebe035dfa467bb352ed7eee WatchSource:0}: Error finding container 54396dbbc47e4ef05fe03261495a4457af6905f89ebe035dfa467bb352ed7eee: Status 404 returned error can't find the container with id 54396dbbc47e4ef05fe03261495a4457af6905f89ebe035dfa467bb352ed7eee Dec 04 00:00:04 crc kubenswrapper[4764]: W1204 00:00:04.769569 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc98b9272_87ec_43a2_97a7_7f08cdafbf2c.slice/crio-5cb902c52acaaba29ea9395d05b55ee1063eefcf74a2f271bef160e7314767c6 WatchSource:0}: Error finding container 5cb902c52acaaba29ea9395d05b55ee1063eefcf74a2f271bef160e7314767c6: Status 404 returned error can't find the container with id 5cb902c52acaaba29ea9395d05b55ee1063eefcf74a2f271bef160e7314767c6 Dec 04 00:00:04 crc kubenswrapper[4764]: I1204 00:00:04.849618 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 00:00:04 crc kubenswrapper[4764]: I1204 00:00:04.973248 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 00:00:04 crc kubenswrapper[4764]: I1204 00:00:04.998626 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:04.999289 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.132996 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-config\") pod \"b3d79ab9-f767-4aa3-b30a-266b311e436a\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.133443 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-dns-svc\") pod \"b3d79ab9-f767-4aa3-b30a-266b311e436a\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.133488 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-dns-svc\") pod \"28e9b357-81d0-4562-8796-f431986ea655\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.133582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-ovsdbserver-sb\") pod \"28e9b357-81d0-4562-8796-f431986ea655\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.133629 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/28e9b357-81d0-4562-8796-f431986ea655-kube-api-access-njj95\") pod \"28e9b357-81d0-4562-8796-f431986ea655\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " 
Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.133656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-config\") pod \"28e9b357-81d0-4562-8796-f431986ea655\" (UID: \"28e9b357-81d0-4562-8796-f431986ea655\") " Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.133679 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phx4g\" (UniqueName: \"kubernetes.io/projected/b3d79ab9-f767-4aa3-b30a-266b311e436a-kube-api-access-phx4g\") pod \"b3d79ab9-f767-4aa3-b30a-266b311e436a\" (UID: \"b3d79ab9-f767-4aa3-b30a-266b311e436a\") " Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.144682 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d79ab9-f767-4aa3-b30a-266b311e436a-kube-api-access-phx4g" (OuterVolumeSpecName: "kube-api-access-phx4g") pod "b3d79ab9-f767-4aa3-b30a-266b311e436a" (UID: "b3d79ab9-f767-4aa3-b30a-266b311e436a"). InnerVolumeSpecName "kube-api-access-phx4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.145359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e9b357-81d0-4562-8796-f431986ea655-kube-api-access-njj95" (OuterVolumeSpecName: "kube-api-access-njj95") pod "28e9b357-81d0-4562-8796-f431986ea655" (UID: "28e9b357-81d0-4562-8796-f431986ea655"). InnerVolumeSpecName "kube-api-access-njj95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.184979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "28e9b357-81d0-4562-8796-f431986ea655" (UID: "28e9b357-81d0-4562-8796-f431986ea655"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.195197 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "28e9b357-81d0-4562-8796-f431986ea655" (UID: "28e9b357-81d0-4562-8796-f431986ea655"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.273908 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.275618 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.275650 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njj95\" (UniqueName: \"kubernetes.io/projected/28e9b357-81d0-4562-8796-f431986ea655-kube-api-access-njj95\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.275664 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phx4g\" (UniqueName: \"kubernetes.io/projected/b3d79ab9-f767-4aa3-b30a-266b311e436a-kube-api-access-phx4g\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.279298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-config" (OuterVolumeSpecName: "config") pod "28e9b357-81d0-4562-8796-f431986ea655" (UID: "28e9b357-81d0-4562-8796-f431986ea655"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.307173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-config" (OuterVolumeSpecName: "config") pod "b3d79ab9-f767-4aa3-b30a-266b311e436a" (UID: "b3d79ab9-f767-4aa3-b30a-266b311e436a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.317299 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3d79ab9-f767-4aa3-b30a-266b311e436a" (UID: "b3d79ab9-f767-4aa3-b30a-266b311e436a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.320859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29413440-d789v"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.362074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.378013 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.378043 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28e9b357-81d0-4562-8796-f431986ea655-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.378054 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3d79ab9-f767-4aa3-b30a-266b311e436a-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:05 crc 
kubenswrapper[4764]: W1204 00:00:05.566865 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc6683d2_f59f_42d2_8666_8960bce251af.slice/crio-386cdc3a465a9f66fef5810cbf15b1a27187df8eace3d8168fc3fd482dd40beb WatchSource:0}: Error finding container 386cdc3a465a9f66fef5810cbf15b1a27187df8eace3d8168fc3fd482dd40beb: Status 404 returned error can't find the container with id 386cdc3a465a9f66fef5810cbf15b1a27187df8eace3d8168fc3fd482dd40beb Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.573986 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.678884 4764 generic.go:334] "Generic (PLEG): container finished" podID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerID="947902412f86c6c04759d68ebaf69307f99a482182aee5cf097d77911d8d56e7" exitCode=0 Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.678946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" event={"ID":"5ec747d4-3aa0-4b3c-bb84-13776b506793","Type":"ContainerDied","Data":"947902412f86c6c04759d68ebaf69307f99a482182aee5cf097d77911d8d56e7"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.678973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" event={"ID":"5ec747d4-3aa0-4b3c-bb84-13776b506793","Type":"ContainerStarted","Data":"54396dbbc47e4ef05fe03261495a4457af6905f89ebe035dfa467bb352ed7eee"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.683664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gb4t" event={"ID":"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a","Type":"ContainerStarted","Data":"0a7a50e6e0b034c7eecd5bfe06e2369a2fb9f54069521e04f4cbe475120f4ccd"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.685581 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" event={"ID":"cc6683d2-f59f-42d2-8666-8960bce251af","Type":"ContainerStarted","Data":"386cdc3a465a9f66fef5810cbf15b1a27187df8eace3d8168fc3fd482dd40beb"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.688536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8zvg9" event={"ID":"c98b9272-87ec-43a2-97a7-7f08cdafbf2c","Type":"ContainerStarted","Data":"6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.688568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8zvg9" event={"ID":"c98b9272-87ec-43a2-97a7-7f08cdafbf2c","Type":"ContainerStarted","Data":"5cb902c52acaaba29ea9395d05b55ee1063eefcf74a2f271bef160e7314767c6"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.692367 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29413440-d789v" event={"ID":"bbfa8eba-6a2a-4791-bf84-a1eb93810e26","Type":"ContainerStarted","Data":"01a44262e2ca8802e11f6a33f412f4f98391281684ae76d4dfd0ff97beada19e"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.692399 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29413440-d789v" event={"ID":"bbfa8eba-6a2a-4791-bf84-a1eb93810e26","Type":"ContainerStarted","Data":"acfcb50263c1debf610fdca3971684237f88a865e91e5caa10b5c7262db120f7"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.694108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" event={"ID":"b3d79ab9-f767-4aa3-b30a-266b311e436a","Type":"ContainerDied","Data":"ba999ba055f3758330d994465b86275c36ff8613ae5ef91adcacc324b5f4e778"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.694145 4764 scope.go:117] "RemoveContainer" 
containerID="2587697e13c5936647c92355e876fd171d3d445c9413414ef2cd41a2a84d46fd" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.695045 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.703631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" event={"ID":"28e9b357-81d0-4562-8796-f431986ea655","Type":"ContainerDied","Data":"e5df4030a6c11bf3b6c197a88b8190230f200694784947433a7b818e0df701ec"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.703800 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-555449b67f-kdvzw" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.710762 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f105a7d8-bb79-4578-98fd-aca60d5ffa10","Type":"ContainerStarted","Data":"7888de431fcf4ed90fa1bb9b2d4c56fc3d113862df5a55e1232b18467adc43ef"} Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.731558 4764 scope.go:117] "RemoveContainer" containerID="a617a91c9e3194bb193c85c5995f063860249a71530afc25099c1200bd38ee3a" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.736919 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-8gb4t" podStartSLOduration=1.90180292 podStartE2EDuration="12.736897733s" podCreationTimestamp="2025-12-03 23:59:53 +0000 UTC" firstStartedPulling="2025-12-03 23:59:54.055384381 +0000 UTC m=+1129.816708792" lastFinishedPulling="2025-12-04 00:00:04.890479184 +0000 UTC m=+1140.651803605" observedRunningTime="2025-12-04 00:00:05.732251519 +0000 UTC m=+1141.493575930" watchObservedRunningTime="2025-12-04 00:00:05.736897733 +0000 UTC m=+1141.498222154" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.750910 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-pruner-29413440-d789v" podStartSLOduration=5.750888286 podStartE2EDuration="5.750888286s" podCreationTimestamp="2025-12-04 00:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:05.750795233 +0000 UTC m=+1141.512119654" watchObservedRunningTime="2025-12-04 00:00:05.750888286 +0000 UTC m=+1141.512212697" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.776333 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8zvg9" podStartSLOduration=10.776317239 podStartE2EDuration="10.776317239s" podCreationTimestamp="2025-12-03 23:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:05.772033284 +0000 UTC m=+1141.533357695" watchObservedRunningTime="2025-12-04 00:00:05.776317239 +0000 UTC m=+1141.537641650" Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.839824 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-555449b67f-kdvzw"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.846076 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-555449b67f-kdvzw"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.879389 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-dmd7s"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.884385 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66c567d66c-dmd7s"] Dec 04 00:00:05 crc kubenswrapper[4764]: I1204 00:00:05.888668 4764 scope.go:117] "RemoveContainer" containerID="6f06074823e5c7bbd044c5cc25538a5709108a2ce7c2d76232f4d8c4dc05e57d" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.554726 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="28e9b357-81d0-4562-8796-f431986ea655" path="/var/lib/kubelet/pods/28e9b357-81d0-4562-8796-f431986ea655/volumes" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.555759 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" path="/var/lib/kubelet/pods/b3d79ab9-f767-4aa3-b30a-266b311e436a/volumes" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.558150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.665090 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.716269 4764 generic.go:334] "Generic (PLEG): container finished" podID="cc6683d2-f59f-42d2-8666-8960bce251af" containerID="016a7a62e13733816d8435cb64f114dba1e801e641a5f17e88eaee4b21fab7e7" exitCode=0 Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.716336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" event={"ID":"cc6683d2-f59f-42d2-8666-8960bce251af","Type":"ContainerDied","Data":"016a7a62e13733816d8435cb64f114dba1e801e641a5f17e88eaee4b21fab7e7"} Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.720276 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbfa8eba-6a2a-4791-bf84-a1eb93810e26" containerID="01a44262e2ca8802e11f6a33f412f4f98391281684ae76d4dfd0ff97beada19e" exitCode=0 Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.720343 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29413440-d789v" event={"ID":"bbfa8eba-6a2a-4791-bf84-a1eb93810e26","Type":"ContainerDied","Data":"01a44262e2ca8802e11f6a33f412f4f98391281684ae76d4dfd0ff97beada19e"} Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.724871 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" event={"ID":"5ec747d4-3aa0-4b3c-bb84-13776b506793","Type":"ContainerStarted","Data":"46106a5cc93080656e8561cd4ed244f717d4e08eb5d402bb091fea383467d78e"} Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.725637 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.769083 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" podStartSLOduration=11.769067516 podStartE2EDuration="11.769067516s" podCreationTimestamp="2025-12-03 23:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:06.76431617 +0000 UTC m=+1142.525640581" watchObservedRunningTime="2025-12-04 00:00:06.769067516 +0000 UTC m=+1142.530391927" Dec 04 00:00:06 crc kubenswrapper[4764]: I1204 00:00:06.845931 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-66c567d66c-dmd7s" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: i/o timeout" Dec 04 00:00:07 crc kubenswrapper[4764]: I1204 00:00:07.743961 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f105a7d8-bb79-4578-98fd-aca60d5ffa10","Type":"ContainerStarted","Data":"60ba5da4792a87b1525f0ffb79e190a4fc0bc794c1d89f435442a97dc33fe1b1"} Dec 04 00:00:07 crc kubenswrapper[4764]: I1204 00:00:07.744429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f105a7d8-bb79-4578-98fd-aca60d5ffa10","Type":"ContainerStarted","Data":"bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772"} Dec 04 00:00:07 crc kubenswrapper[4764]: I1204 00:00:07.745992 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-northd-0" Dec 04 00:00:07 crc kubenswrapper[4764]: I1204 00:00:07.765472 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=11.243731493 podStartE2EDuration="12.765454843s" podCreationTimestamp="2025-12-03 23:59:55 +0000 UTC" firstStartedPulling="2025-12-04 00:00:05.368163849 +0000 UTC m=+1141.129488260" lastFinishedPulling="2025-12-04 00:00:06.889887199 +0000 UTC m=+1142.651211610" observedRunningTime="2025-12-04 00:00:07.761420975 +0000 UTC m=+1143.522745426" watchObservedRunningTime="2025-12-04 00:00:07.765454843 +0000 UTC m=+1143.526779254" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.058430 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.141908 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbp4\" (UniqueName: \"kubernetes.io/projected/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-kube-api-access-ckbp4\") pod \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.141991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-serviceca\") pod \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\" (UID: \"bbfa8eba-6a2a-4791-bf84-a1eb93810e26\") " Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.142581 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-serviceca" (OuterVolumeSpecName: "serviceca") pod "bbfa8eba-6a2a-4791-bf84-a1eb93810e26" (UID: "bbfa8eba-6a2a-4791-bf84-a1eb93810e26"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.146821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-kube-api-access-ckbp4" (OuterVolumeSpecName: "kube-api-access-ckbp4") pod "bbfa8eba-6a2a-4791-bf84-a1eb93810e26" (UID: "bbfa8eba-6a2a-4791-bf84-a1eb93810e26"). InnerVolumeSpecName "kube-api-access-ckbp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.243246 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbp4\" (UniqueName: \"kubernetes.io/projected/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-kube-api-access-ckbp4\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.243270 4764 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbfa8eba-6a2a-4791-bf84-a1eb93810e26-serviceca\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.248683 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.344358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6683d2-f59f-42d2-8666-8960bce251af-config-volume\") pod \"cc6683d2-f59f-42d2-8666-8960bce251af\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.344400 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6683d2-f59f-42d2-8666-8960bce251af-secret-volume\") pod \"cc6683d2-f59f-42d2-8666-8960bce251af\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.344519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbjgn\" (UniqueName: \"kubernetes.io/projected/cc6683d2-f59f-42d2-8666-8960bce251af-kube-api-access-qbjgn\") pod \"cc6683d2-f59f-42d2-8666-8960bce251af\" (UID: \"cc6683d2-f59f-42d2-8666-8960bce251af\") " Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.345507 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc6683d2-f59f-42d2-8666-8960bce251af-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc6683d2-f59f-42d2-8666-8960bce251af" (UID: "cc6683d2-f59f-42d2-8666-8960bce251af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.348774 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc6683d2-f59f-42d2-8666-8960bce251af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc6683d2-f59f-42d2-8666-8960bce251af" (UID: "cc6683d2-f59f-42d2-8666-8960bce251af"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.350968 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6683d2-f59f-42d2-8666-8960bce251af-kube-api-access-qbjgn" (OuterVolumeSpecName: "kube-api-access-qbjgn") pod "cc6683d2-f59f-42d2-8666-8960bce251af" (UID: "cc6683d2-f59f-42d2-8666-8960bce251af"). InnerVolumeSpecName "kube-api-access-qbjgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.446582 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc6683d2-f59f-42d2-8666-8960bce251af-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.446938 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc6683d2-f59f-42d2-8666-8960bce251af-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.446952 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbjgn\" (UniqueName: \"kubernetes.io/projected/cc6683d2-f59f-42d2-8666-8960bce251af-kube-api-access-qbjgn\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.547843 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 04 00:00:08 crc kubenswrapper[4764]: E1204 00:00:08.548029 4764 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 00:00:08 crc kubenswrapper[4764]: E1204 00:00:08.548045 4764 projected.go:194] Error preparing data for projected volume etc-swift for 
pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 00:00:08 crc kubenswrapper[4764]: E1204 00:00:08.548091 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift podName:1691fb5b-c57a-4773-9710-347c99bd9712 nodeName:}" failed. No retries permitted until 2025-12-04 00:00:24.548077388 +0000 UTC m=+1160.309401799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift") pod "swift-storage-0" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712") : configmap "swift-ring-files" not found Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.754323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" event={"ID":"cc6683d2-f59f-42d2-8666-8960bce251af","Type":"ContainerDied","Data":"386cdc3a465a9f66fef5810cbf15b1a27187df8eace3d8168fc3fd482dd40beb"} Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.754369 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386cdc3a465a9f66fef5810cbf15b1a27187df8eace3d8168fc3fd482dd40beb" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.754383 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.756252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29413440-d789v" event={"ID":"bbfa8eba-6a2a-4791-bf84-a1eb93810e26","Type":"ContainerDied","Data":"acfcb50263c1debf610fdca3971684237f88a865e91e5caa10b5c7262db120f7"} Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.756288 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acfcb50263c1debf610fdca3971684237f88a865e91e5caa10b5c7262db120f7" Dec 04 00:00:08 crc kubenswrapper[4764]: I1204 00:00:08.756901 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29413440-d789v" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.687793 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3d02-account-create-update-97vbs"] Dec 04 00:00:09 crc kubenswrapper[4764]: E1204 00:00:09.688126 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e9b357-81d0-4562-8796-f431986ea655" containerName="init" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688140 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e9b357-81d0-4562-8796-f431986ea655" containerName="init" Dec 04 00:00:09 crc kubenswrapper[4764]: E1204 00:00:09.688150 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6683d2-f59f-42d2-8666-8960bce251af" containerName="collect-profiles" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688156 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6683d2-f59f-42d2-8666-8960bce251af" containerName="collect-profiles" Dec 04 00:00:09 crc kubenswrapper[4764]: E1204 00:00:09.688173 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbfa8eba-6a2a-4791-bf84-a1eb93810e26" containerName="image-pruner" 
Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688178 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbfa8eba-6a2a-4791-bf84-a1eb93810e26" containerName="image-pruner" Dec 04 00:00:09 crc kubenswrapper[4764]: E1204 00:00:09.688191 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="dnsmasq-dns" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688197 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="dnsmasq-dns" Dec 04 00:00:09 crc kubenswrapper[4764]: E1204 00:00:09.688209 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="init" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688214 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="init" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688354 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6683d2-f59f-42d2-8666-8960bce251af" containerName="collect-profiles" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688366 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbfa8eba-6a2a-4791-bf84-a1eb93810e26" containerName="image-pruner" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688376 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d79ab9-f767-4aa3-b30a-266b311e436a" containerName="dnsmasq-dns" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688383 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e9b357-81d0-4562-8796-f431986ea655" containerName="init" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.688956 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.692566 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.719543 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d02-account-create-update-97vbs"] Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.736157 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-2tx5s"] Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.737889 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.747982 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2tx5s"] Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.772077 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992d93ad-5b93-4369-adec-095082f4da81-operator-scripts\") pod \"keystone-3d02-account-create-update-97vbs\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.772155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gntw\" (UniqueName: \"kubernetes.io/projected/992d93ad-5b93-4369-adec-095082f4da81-kube-api-access-4gntw\") pod \"keystone-3d02-account-create-update-97vbs\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.878963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjmnk\" 
(UniqueName: \"kubernetes.io/projected/294a0599-742d-479d-9758-12c58c571da7-kube-api-access-jjmnk\") pod \"keystone-db-create-2tx5s\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.879300 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992d93ad-5b93-4369-adec-095082f4da81-operator-scripts\") pod \"keystone-3d02-account-create-update-97vbs\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.879329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gntw\" (UniqueName: \"kubernetes.io/projected/992d93ad-5b93-4369-adec-095082f4da81-kube-api-access-4gntw\") pod \"keystone-3d02-account-create-update-97vbs\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.879402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294a0599-742d-479d-9758-12c58c571da7-operator-scripts\") pod \"keystone-db-create-2tx5s\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.880006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992d93ad-5b93-4369-adec-095082f4da81-operator-scripts\") pod \"keystone-3d02-account-create-update-97vbs\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.901249 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4gntw\" (UniqueName: \"kubernetes.io/projected/992d93ad-5b93-4369-adec-095082f4da81-kube-api-access-4gntw\") pod \"keystone-3d02-account-create-update-97vbs\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.951588 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jzlcq"] Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.952493 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.971840 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jzlcq"] Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.994547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmnk\" (UniqueName: \"kubernetes.io/projected/294a0599-742d-479d-9758-12c58c571da7-kube-api-access-jjmnk\") pod \"keystone-db-create-2tx5s\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.995040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294a0599-742d-479d-9758-12c58c571da7-operator-scripts\") pod \"keystone-db-create-2tx5s\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:09 crc kubenswrapper[4764]: I1204 00:00:09.995757 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294a0599-742d-479d-9758-12c58c571da7-operator-scripts\") pod \"keystone-db-create-2tx5s\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:10 
crc kubenswrapper[4764]: I1204 00:00:10.007429 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.023696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmnk\" (UniqueName: \"kubernetes.io/projected/294a0599-742d-479d-9758-12c58c571da7-kube-api-access-jjmnk\") pod \"keystone-db-create-2tx5s\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.060269 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.083877 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1158-account-create-update-vpcpl"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.085176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.087375 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.093651 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1158-account-create-update-vpcpl"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.097147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdxp\" (UniqueName: \"kubernetes.io/projected/b8181a75-e25f-462c-90cb-1fddeea8ae6c-kube-api-access-9fdxp\") pod \"placement-db-create-jzlcq\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.097191 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8181a75-e25f-462c-90cb-1fddeea8ae6c-operator-scripts\") pod \"placement-db-create-jzlcq\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.200590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fdxp\" (UniqueName: \"kubernetes.io/projected/b8181a75-e25f-462c-90cb-1fddeea8ae6c-kube-api-access-9fdxp\") pod \"placement-db-create-jzlcq\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.200940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8181a75-e25f-462c-90cb-1fddeea8ae6c-operator-scripts\") pod \"placement-db-create-jzlcq\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " 
pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.200968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1258d21b-9d01-4f1c-9508-ec94292425eb-operator-scripts\") pod \"placement-1158-account-create-update-vpcpl\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.201064 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbslq\" (UniqueName: \"kubernetes.io/projected/1258d21b-9d01-4f1c-9508-ec94292425eb-kube-api-access-pbslq\") pod \"placement-1158-account-create-update-vpcpl\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.201599 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8181a75-e25f-462c-90cb-1fddeea8ae6c-operator-scripts\") pod \"placement-db-create-jzlcq\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.220672 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fdxp\" (UniqueName: \"kubernetes.io/projected/b8181a75-e25f-462c-90cb-1fddeea8ae6c-kube-api-access-9fdxp\") pod \"placement-db-create-jzlcq\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.289252 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-ccgpx"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.289578 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.290279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.301333 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ccgpx"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.304681 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbslq\" (UniqueName: \"kubernetes.io/projected/1258d21b-9d01-4f1c-9508-ec94292425eb-kube-api-access-pbslq\") pod \"placement-1158-account-create-update-vpcpl\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.304963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1258d21b-9d01-4f1c-9508-ec94292425eb-operator-scripts\") pod \"placement-1158-account-create-update-vpcpl\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.305743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1258d21b-9d01-4f1c-9508-ec94292425eb-operator-scripts\") pod \"placement-1158-account-create-update-vpcpl\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.323474 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbslq\" (UniqueName: \"kubernetes.io/projected/1258d21b-9d01-4f1c-9508-ec94292425eb-kube-api-access-pbslq\") pod \"placement-1158-account-create-update-vpcpl\" (UID: 
\"1258d21b-9d01-4f1c-9508-ec94292425eb\") " pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.406088 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h627\" (UniqueName: \"kubernetes.io/projected/9f6f11e7-3aec-43c4-a35d-13882953a668-kube-api-access-8h627\") pod \"glance-db-create-ccgpx\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.406281 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6f11e7-3aec-43c4-a35d-13882953a668-operator-scripts\") pod \"glance-db-create-ccgpx\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.439779 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6ca1-account-create-update-7rgls"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.440881 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.445137 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.457363 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6ca1-account-create-update-7rgls"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.476057 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.511105 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h627\" (UniqueName: \"kubernetes.io/projected/9f6f11e7-3aec-43c4-a35d-13882953a668-kube-api-access-8h627\") pod \"glance-db-create-ccgpx\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.511239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6f11e7-3aec-43c4-a35d-13882953a668-operator-scripts\") pod \"glance-db-create-ccgpx\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.512574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6f11e7-3aec-43c4-a35d-13882953a668-operator-scripts\") pod \"glance-db-create-ccgpx\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.526583 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3d02-account-create-update-97vbs"] Dec 04 00:00:10 crc kubenswrapper[4764]: W1204 00:00:10.545378 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod992d93ad_5b93_4369_adec_095082f4da81.slice/crio-31b030532f73102c9c26c88537eb0ab7095b6a6f301e21b63b275218ceaffced WatchSource:0}: Error finding container 31b030532f73102c9c26c88537eb0ab7095b6a6f301e21b63b275218ceaffced: Status 404 returned error can't find the container with id 31b030532f73102c9c26c88537eb0ab7095b6a6f301e21b63b275218ceaffced Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 
00:00:10.568349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h627\" (UniqueName: \"kubernetes.io/projected/9f6f11e7-3aec-43c4-a35d-13882953a668-kube-api-access-8h627\") pod \"glance-db-create-ccgpx\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.606883 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.614998 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-operator-scripts\") pod \"glance-6ca1-account-create-update-7rgls\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.615156 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrr2p\" (UniqueName: \"kubernetes.io/projected/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-kube-api-access-mrr2p\") pod \"glance-6ca1-account-create-update-7rgls\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.661170 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-2tx5s"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.694336 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jzlcq"] Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.716975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrr2p\" (UniqueName: \"kubernetes.io/projected/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-kube-api-access-mrr2p\") pod 
\"glance-6ca1-account-create-update-7rgls\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.717337 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-operator-scripts\") pod \"glance-6ca1-account-create-update-7rgls\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.718199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-operator-scripts\") pod \"glance-6ca1-account-create-update-7rgls\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.739940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrr2p\" (UniqueName: \"kubernetes.io/projected/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-kube-api-access-mrr2p\") pod \"glance-6ca1-account-create-update-7rgls\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.760867 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.805678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2tx5s" event={"ID":"294a0599-742d-479d-9758-12c58c571da7","Type":"ContainerStarted","Data":"88ef18a94648fbdc751d4c4af67fdf573218114afad3a0b8e0ecafc81ae3173c"} Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.807997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d02-account-create-update-97vbs" event={"ID":"992d93ad-5b93-4369-adec-095082f4da81","Type":"ContainerStarted","Data":"31b030532f73102c9c26c88537eb0ab7095b6a6f301e21b63b275218ceaffced"} Dec 04 00:00:10 crc kubenswrapper[4764]: I1204 00:00:10.808889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jzlcq" event={"ID":"b8181a75-e25f-462c-90cb-1fddeea8ae6c","Type":"ContainerStarted","Data":"e9a20e898a701d98d00f58708242257b8608372864176548aa6b7d0a2234e7ed"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.015250 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.070888 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-bc2xr"] Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.071096 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerName="dnsmasq-dns" containerID="cri-o://93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119" gracePeriod=10 Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.090891 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1158-account-create-update-vpcpl"] Dec 04 00:00:11 crc kubenswrapper[4764]: W1204 00:00:11.101916 4764 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1258d21b_9d01_4f1c_9508_ec94292425eb.slice/crio-afdcea68707ffb80abc552abfe7b57305bf3e31924f21385f53795e73aebc6c1 WatchSource:0}: Error finding container afdcea68707ffb80abc552abfe7b57305bf3e31924f21385f53795e73aebc6c1: Status 404 returned error can't find the container with id afdcea68707ffb80abc552abfe7b57305bf3e31924f21385f53795e73aebc6c1 Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.155895 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-ccgpx"] Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.342340 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6ca1-account-create-update-7rgls"] Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.488683 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.636623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlxcw\" (UniqueName: \"kubernetes.io/projected/1378ad3a-99d0-47e1-8c66-74a67341e30d-kube-api-access-vlxcw\") pod \"1378ad3a-99d0-47e1-8c66-74a67341e30d\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.636780 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-config\") pod \"1378ad3a-99d0-47e1-8c66-74a67341e30d\" (UID: \"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.636913 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-dns-svc\") pod \"1378ad3a-99d0-47e1-8c66-74a67341e30d\" (UID: 
\"1378ad3a-99d0-47e1-8c66-74a67341e30d\") " Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.645792 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1378ad3a-99d0-47e1-8c66-74a67341e30d-kube-api-access-vlxcw" (OuterVolumeSpecName: "kube-api-access-vlxcw") pod "1378ad3a-99d0-47e1-8c66-74a67341e30d" (UID: "1378ad3a-99d0-47e1-8c66-74a67341e30d"). InnerVolumeSpecName "kube-api-access-vlxcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.649511 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlxcw\" (UniqueName: \"kubernetes.io/projected/1378ad3a-99d0-47e1-8c66-74a67341e30d-kube-api-access-vlxcw\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.673321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1378ad3a-99d0-47e1-8c66-74a67341e30d" (UID: "1378ad3a-99d0-47e1-8c66-74a67341e30d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.687236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-config" (OuterVolumeSpecName: "config") pod "1378ad3a-99d0-47e1-8c66-74a67341e30d" (UID: "1378ad3a-99d0-47e1-8c66-74a67341e30d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.751391 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.751436 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1378ad3a-99d0-47e1-8c66-74a67341e30d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.817597 4764 generic.go:334] "Generic (PLEG): container finished" podID="294a0599-742d-479d-9758-12c58c571da7" containerID="a1651179f0300b2698a60d331e5872234d6b46c899dc9fbf2c21a9e9a05ec719" exitCode=0 Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.817648 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2tx5s" event={"ID":"294a0599-742d-479d-9758-12c58c571da7","Type":"ContainerDied","Data":"a1651179f0300b2698a60d331e5872234d6b46c899dc9fbf2c21a9e9a05ec719"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.820934 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ca1-account-create-update-7rgls" event={"ID":"e7263a5a-cae0-4bbb-8429-d4d59d79c63b","Type":"ContainerStarted","Data":"f8de5b50ece874cdc4fb530bcf1a5c776706712455e9b3f425f252f6afefe588"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.821066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ca1-account-create-update-7rgls" event={"ID":"e7263a5a-cae0-4bbb-8429-d4d59d79c63b","Type":"ContainerStarted","Data":"8b99f4badd50c617104a535abc62b610cc4f4c9356d7e9862218e8b9293e1f7f"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.822430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ccgpx" 
event={"ID":"9f6f11e7-3aec-43c4-a35d-13882953a668","Type":"ContainerStarted","Data":"2abca4237fb017c47159b6d2841736e119b9cc88ea0295253fb1efe2f7642af6"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.822452 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ccgpx" event={"ID":"9f6f11e7-3aec-43c4-a35d-13882953a668","Type":"ContainerStarted","Data":"21b231a2b5d01e43ec38a089e633e1563385a35bbf30e6dbf98cf2ad6b4f229d"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.823355 4764 generic.go:334] "Generic (PLEG): container finished" podID="992d93ad-5b93-4369-adec-095082f4da81" containerID="3f2b39a4c4fb701732f068df4fd94e2b91815aa2dc381f6953b092ba79953e8d" exitCode=0 Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.823390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d02-account-create-update-97vbs" event={"ID":"992d93ad-5b93-4369-adec-095082f4da81","Type":"ContainerDied","Data":"3f2b39a4c4fb701732f068df4fd94e2b91815aa2dc381f6953b092ba79953e8d"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.824274 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1158-account-create-update-vpcpl" event={"ID":"1258d21b-9d01-4f1c-9508-ec94292425eb","Type":"ContainerStarted","Data":"b663718dd91cec728714f509ace36974bc05c1712376e3e482576b0703933bac"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.824294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1158-account-create-update-vpcpl" event={"ID":"1258d21b-9d01-4f1c-9508-ec94292425eb","Type":"ContainerStarted","Data":"afdcea68707ffb80abc552abfe7b57305bf3e31924f21385f53795e73aebc6c1"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.837264 4764 generic.go:334] "Generic (PLEG): container finished" podID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerID="93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119" exitCode=0 Dec 04 00:00:11 crc kubenswrapper[4764]: 
I1204 00:00:11.837321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" event={"ID":"1378ad3a-99d0-47e1-8c66-74a67341e30d","Type":"ContainerDied","Data":"93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.837345 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" event={"ID":"1378ad3a-99d0-47e1-8c66-74a67341e30d","Type":"ContainerDied","Data":"62709bbc819ce788c1ac91e657496bf29833bdb4cd3472957e498526c771d492"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.837361 4764 scope.go:117] "RemoveContainer" containerID="93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.837442 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-bc2xr" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.839704 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8181a75-e25f-462c-90cb-1fddeea8ae6c" containerID="f2a95f73654fe1711cc05ab55f3be9597380a0876777040a000064c07657a226" exitCode=0 Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.839789 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jzlcq" event={"ID":"b8181a75-e25f-462c-90cb-1fddeea8ae6c","Type":"ContainerDied","Data":"f2a95f73654fe1711cc05ab55f3be9597380a0876777040a000064c07657a226"} Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.861371 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-6ca1-account-create-update-7rgls" podStartSLOduration=1.86135247 podStartE2EDuration="1.86135247s" podCreationTimestamp="2025-12-04 00:00:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:11.853289222 +0000 UTC 
m=+1147.614613643" watchObservedRunningTime="2025-12-04 00:00:11.86135247 +0000 UTC m=+1147.622676881" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.872057 4764 scope.go:117] "RemoveContainer" containerID="971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.911358 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-1158-account-create-update-vpcpl" podStartSLOduration=1.911342036 podStartE2EDuration="1.911342036s" podCreationTimestamp="2025-12-04 00:00:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:11.909649394 +0000 UTC m=+1147.670973825" watchObservedRunningTime="2025-12-04 00:00:11.911342036 +0000 UTC m=+1147.672666447" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.911666 4764 scope.go:117] "RemoveContainer" containerID="93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119" Dec 04 00:00:11 crc kubenswrapper[4764]: E1204 00:00:11.912344 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119\": container with ID starting with 93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119 not found: ID does not exist" containerID="93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.912391 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119"} err="failed to get container status \"93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119\": rpc error: code = NotFound desc = could not find container \"93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119\": container with ID 
starting with 93cbf70e736f4b6cb6f978b706226f88210ae9fd99c714a1caa1406614282119 not found: ID does not exist" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.912421 4764 scope.go:117] "RemoveContainer" containerID="971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998" Dec 04 00:00:11 crc kubenswrapper[4764]: E1204 00:00:11.912861 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998\": container with ID starting with 971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998 not found: ID does not exist" containerID="971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.912891 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998"} err="failed to get container status \"971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998\": rpc error: code = NotFound desc = could not find container \"971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998\": container with ID starting with 971c20af01c842786d061a25f0ef9ae7c03bd470941bae2dc290a8bff920c998 not found: ID does not exist" Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.928483 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-bc2xr"] Dec 04 00:00:11 crc kubenswrapper[4764]: I1204 00:00:11.936383 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-bc2xr"] Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.562468 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" path="/var/lib/kubelet/pods/1378ad3a-99d0-47e1-8c66-74a67341e30d/volumes" Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.850078 4764 
generic.go:334] "Generic (PLEG): container finished" podID="1258d21b-9d01-4f1c-9508-ec94292425eb" containerID="b663718dd91cec728714f509ace36974bc05c1712376e3e482576b0703933bac" exitCode=0 Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.850142 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1158-account-create-update-vpcpl" event={"ID":"1258d21b-9d01-4f1c-9508-ec94292425eb","Type":"ContainerDied","Data":"b663718dd91cec728714f509ace36974bc05c1712376e3e482576b0703933bac"} Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.854481 4764 generic.go:334] "Generic (PLEG): container finished" podID="e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" containerID="0a7a50e6e0b034c7eecd5bfe06e2369a2fb9f54069521e04f4cbe475120f4ccd" exitCode=0 Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.854529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gb4t" event={"ID":"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a","Type":"ContainerDied","Data":"0a7a50e6e0b034c7eecd5bfe06e2369a2fb9f54069521e04f4cbe475120f4ccd"} Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.856839 4764 generic.go:334] "Generic (PLEG): container finished" podID="e7263a5a-cae0-4bbb-8429-d4d59d79c63b" containerID="f8de5b50ece874cdc4fb530bcf1a5c776706712455e9b3f425f252f6afefe588" exitCode=0 Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.856923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ca1-account-create-update-7rgls" event={"ID":"e7263a5a-cae0-4bbb-8429-d4d59d79c63b","Type":"ContainerDied","Data":"f8de5b50ece874cdc4fb530bcf1a5c776706712455e9b3f425f252f6afefe588"} Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.858711 4764 generic.go:334] "Generic (PLEG): container finished" podID="9f6f11e7-3aec-43c4-a35d-13882953a668" containerID="2abca4237fb017c47159b6d2841736e119b9cc88ea0295253fb1efe2f7642af6" exitCode=0 Dec 04 00:00:12 crc kubenswrapper[4764]: I1204 00:00:12.858756 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ccgpx" event={"ID":"9f6f11e7-3aec-43c4-a35d-13882953a668","Type":"ContainerDied","Data":"2abca4237fb017c47159b6d2841736e119b9cc88ea0295253fb1efe2f7642af6"} Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.196389 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.282415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294a0599-742d-479d-9758-12c58c571da7-operator-scripts\") pod \"294a0599-742d-479d-9758-12c58c571da7\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.282704 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjmnk\" (UniqueName: \"kubernetes.io/projected/294a0599-742d-479d-9758-12c58c571da7-kube-api-access-jjmnk\") pod \"294a0599-742d-479d-9758-12c58c571da7\" (UID: \"294a0599-742d-479d-9758-12c58c571da7\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.284245 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/294a0599-742d-479d-9758-12c58c571da7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "294a0599-742d-479d-9758-12c58c571da7" (UID: "294a0599-742d-479d-9758-12c58c571da7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.288674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294a0599-742d-479d-9758-12c58c571da7-kube-api-access-jjmnk" (OuterVolumeSpecName: "kube-api-access-jjmnk") pod "294a0599-742d-479d-9758-12c58c571da7" (UID: "294a0599-742d-479d-9758-12c58c571da7"). InnerVolumeSpecName "kube-api-access-jjmnk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.373345 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.379126 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.383872 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.383973 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjmnk\" (UniqueName: \"kubernetes.io/projected/294a0599-742d-479d-9758-12c58c571da7-kube-api-access-jjmnk\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.383990 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294a0599-742d-479d-9758-12c58c571da7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.484620 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fdxp\" (UniqueName: \"kubernetes.io/projected/b8181a75-e25f-462c-90cb-1fddeea8ae6c-kube-api-access-9fdxp\") pod \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.484681 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992d93ad-5b93-4369-adec-095082f4da81-operator-scripts\") pod \"992d93ad-5b93-4369-adec-095082f4da81\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.484756 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8181a75-e25f-462c-90cb-1fddeea8ae6c-operator-scripts\") pod \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\" (UID: \"b8181a75-e25f-462c-90cb-1fddeea8ae6c\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.484783 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gntw\" (UniqueName: \"kubernetes.io/projected/992d93ad-5b93-4369-adec-095082f4da81-kube-api-access-4gntw\") pod \"992d93ad-5b93-4369-adec-095082f4da81\" (UID: \"992d93ad-5b93-4369-adec-095082f4da81\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.484806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6f11e7-3aec-43c4-a35d-13882953a668-operator-scripts\") pod \"9f6f11e7-3aec-43c4-a35d-13882953a668\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.484869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h627\" (UniqueName: \"kubernetes.io/projected/9f6f11e7-3aec-43c4-a35d-13882953a668-kube-api-access-8h627\") pod \"9f6f11e7-3aec-43c4-a35d-13882953a668\" (UID: \"9f6f11e7-3aec-43c4-a35d-13882953a668\") " Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.485711 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8181a75-e25f-462c-90cb-1fddeea8ae6c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8181a75-e25f-462c-90cb-1fddeea8ae6c" (UID: "b8181a75-e25f-462c-90cb-1fddeea8ae6c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.485973 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/992d93ad-5b93-4369-adec-095082f4da81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "992d93ad-5b93-4369-adec-095082f4da81" (UID: "992d93ad-5b93-4369-adec-095082f4da81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.485984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f6f11e7-3aec-43c4-a35d-13882953a668-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9f6f11e7-3aec-43c4-a35d-13882953a668" (UID: "9f6f11e7-3aec-43c4-a35d-13882953a668"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.488270 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8181a75-e25f-462c-90cb-1fddeea8ae6c-kube-api-access-9fdxp" (OuterVolumeSpecName: "kube-api-access-9fdxp") pod "b8181a75-e25f-462c-90cb-1fddeea8ae6c" (UID: "b8181a75-e25f-462c-90cb-1fddeea8ae6c"). InnerVolumeSpecName "kube-api-access-9fdxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.488583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6f11e7-3aec-43c4-a35d-13882953a668-kube-api-access-8h627" (OuterVolumeSpecName: "kube-api-access-8h627") pod "9f6f11e7-3aec-43c4-a35d-13882953a668" (UID: "9f6f11e7-3aec-43c4-a35d-13882953a668"). InnerVolumeSpecName "kube-api-access-8h627". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.489332 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992d93ad-5b93-4369-adec-095082f4da81-kube-api-access-4gntw" (OuterVolumeSpecName: "kube-api-access-4gntw") pod "992d93ad-5b93-4369-adec-095082f4da81" (UID: "992d93ad-5b93-4369-adec-095082f4da81"). InnerVolumeSpecName "kube-api-access-4gntw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.586888 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/992d93ad-5b93-4369-adec-095082f4da81-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.586919 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8181a75-e25f-462c-90cb-1fddeea8ae6c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.586930 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gntw\" (UniqueName: \"kubernetes.io/projected/992d93ad-5b93-4369-adec-095082f4da81-kube-api-access-4gntw\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.586942 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9f6f11e7-3aec-43c4-a35d-13882953a668-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.586972 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h627\" (UniqueName: \"kubernetes.io/projected/9f6f11e7-3aec-43c4-a35d-13882953a668-kube-api-access-8h627\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.586980 4764 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-9fdxp\" (UniqueName: \"kubernetes.io/projected/b8181a75-e25f-462c-90cb-1fddeea8ae6c-kube-api-access-9fdxp\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.870178 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-ccgpx" event={"ID":"9f6f11e7-3aec-43c4-a35d-13882953a668","Type":"ContainerDied","Data":"21b231a2b5d01e43ec38a089e633e1563385a35bbf30e6dbf98cf2ad6b4f229d"} Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.870221 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21b231a2b5d01e43ec38a089e633e1563385a35bbf30e6dbf98cf2ad6b4f229d" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.870290 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-ccgpx" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.875950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3d02-account-create-update-97vbs" event={"ID":"992d93ad-5b93-4369-adec-095082f4da81","Type":"ContainerDied","Data":"31b030532f73102c9c26c88537eb0ab7095b6a6f301e21b63b275218ceaffced"} Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.875993 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31b030532f73102c9c26c88537eb0ab7095b6a6f301e21b63b275218ceaffced" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.876082 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3d02-account-create-update-97vbs" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.891761 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jzlcq" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.891771 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jzlcq" event={"ID":"b8181a75-e25f-462c-90cb-1fddeea8ae6c","Type":"ContainerDied","Data":"e9a20e898a701d98d00f58708242257b8608372864176548aa6b7d0a2234e7ed"} Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.891819 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a20e898a701d98d00f58708242257b8608372864176548aa6b7d0a2234e7ed" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.897818 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-2tx5s" Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.900875 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-2tx5s" event={"ID":"294a0599-742d-479d-9758-12c58c571da7","Type":"ContainerDied","Data":"88ef18a94648fbdc751d4c4af67fdf573218114afad3a0b8e0ecafc81ae3173c"} Dec 04 00:00:13 crc kubenswrapper[4764]: I1204 00:00:13.900928 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88ef18a94648fbdc751d4c4af67fdf573218114afad3a0b8e0ecafc81ae3173c" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.252987 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.351019 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-8gb4t" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.355576 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.401355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1258d21b-9d01-4f1c-9508-ec94292425eb-operator-scripts\") pod \"1258d21b-9d01-4f1c-9508-ec94292425eb\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.401425 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbslq\" (UniqueName: \"kubernetes.io/projected/1258d21b-9d01-4f1c-9508-ec94292425eb-kube-api-access-pbslq\") pod \"1258d21b-9d01-4f1c-9508-ec94292425eb\" (UID: \"1258d21b-9d01-4f1c-9508-ec94292425eb\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.402277 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1258d21b-9d01-4f1c-9508-ec94292425eb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1258d21b-9d01-4f1c-9508-ec94292425eb" (UID: "1258d21b-9d01-4f1c-9508-ec94292425eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.410845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1258d21b-9d01-4f1c-9508-ec94292425eb-kube-api-access-pbslq" (OuterVolumeSpecName: "kube-api-access-pbslq") pod "1258d21b-9d01-4f1c-9508-ec94292425eb" (UID: "1258d21b-9d01-4f1c-9508-ec94292425eb"). InnerVolumeSpecName "kube-api-access-pbslq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-scripts\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503192 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrr2p\" (UniqueName: \"kubernetes.io/projected/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-kube-api-access-mrr2p\") pod \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503247 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8crrj\" (UniqueName: \"kubernetes.io/projected/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-kube-api-access-8crrj\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503302 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-etc-swift\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503817 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-ring-data-devices\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503861 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-combined-ca-bundle\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-dispersionconf\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-operator-scripts\") pod \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\" (UID: \"e7263a5a-cae0-4bbb-8429-d4d59d79c63b\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.503983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-swiftconf\") pod \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\" (UID: \"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a\") " Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.504243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.504378 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.504429 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7263a5a-cae0-4bbb-8429-d4d59d79c63b" (UID: "e7263a5a-cae0-4bbb-8429-d4d59d79c63b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.504436 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1258d21b-9d01-4f1c-9508-ec94292425eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.504475 4764 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.504490 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbslq\" (UniqueName: \"kubernetes.io/projected/1258d21b-9d01-4f1c-9508-ec94292425eb-kube-api-access-pbslq\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.507738 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-kube-api-access-8crrj" 
(OuterVolumeSpecName: "kube-api-access-8crrj") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "kube-api-access-8crrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.507908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-kube-api-access-mrr2p" (OuterVolumeSpecName: "kube-api-access-mrr2p") pod "e7263a5a-cae0-4bbb-8429-d4d59d79c63b" (UID: "e7263a5a-cae0-4bbb-8429-d4d59d79c63b"). InnerVolumeSpecName "kube-api-access-mrr2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.509591 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.521871 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-scripts" (OuterVolumeSpecName: "scripts") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.523149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.526523 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" (UID: "e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605791 4764 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605820 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605829 4764 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605838 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605848 4764 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605855 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605865 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrr2p\" (UniqueName: \"kubernetes.io/projected/e7263a5a-cae0-4bbb-8429-d4d59d79c63b-kube-api-access-mrr2p\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.605875 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8crrj\" (UniqueName: \"kubernetes.io/projected/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a-kube-api-access-8crrj\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.926544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1158-account-create-update-vpcpl" event={"ID":"1258d21b-9d01-4f1c-9508-ec94292425eb","Type":"ContainerDied","Data":"afdcea68707ffb80abc552abfe7b57305bf3e31924f21385f53795e73aebc6c1"} Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.926583 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afdcea68707ffb80abc552abfe7b57305bf3e31924f21385f53795e73aebc6c1" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.926632 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1158-account-create-update-vpcpl" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.928862 4764 generic.go:334] "Generic (PLEG): container finished" podID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerID="6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11" exitCode=0 Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.928933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9","Type":"ContainerDied","Data":"6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11"} Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.936572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-8gb4t" event={"ID":"e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a","Type":"ContainerDied","Data":"135701ae63a732cdf6faa798f8fa43f54a5fff8b8574558211470d9a951fdec0"} Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.936625 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135701ae63a732cdf6faa798f8fa43f54a5fff8b8574558211470d9a951fdec0" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.936697 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-8gb4t" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.943235 4764 generic.go:334] "Generic (PLEG): container finished" podID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerID="309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0" exitCode=0 Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.943290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"76708e9b-1db4-42ca-94d2-7ff96d08d855","Type":"ContainerDied","Data":"309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0"} Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.953468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ca1-account-create-update-7rgls" event={"ID":"e7263a5a-cae0-4bbb-8429-d4d59d79c63b","Type":"ContainerDied","Data":"8b99f4badd50c617104a535abc62b610cc4f4c9356d7e9862218e8b9293e1f7f"} Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.953518 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b99f4badd50c617104a535abc62b610cc4f4c9356d7e9862218e8b9293e1f7f" Dec 04 00:00:14 crc kubenswrapper[4764]: I1204 00:00:14.953696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6ca1-account-create-update-7rgls" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544073 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-qwhzb"] Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544442 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8181a75-e25f-462c-90cb-1fddeea8ae6c" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544461 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8181a75-e25f-462c-90cb-1fddeea8ae6c" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544474 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7263a5a-cae0-4bbb-8429-d4d59d79c63b" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544481 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7263a5a-cae0-4bbb-8429-d4d59d79c63b" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544493 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992d93ad-5b93-4369-adec-095082f4da81" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544499 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="992d93ad-5b93-4369-adec-095082f4da81" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544510 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerName="dnsmasq-dns" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerName="dnsmasq-dns" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544524 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerName="init" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544530 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerName="init" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544547 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1258d21b-9d01-4f1c-9508-ec94292425eb" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544553 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1258d21b-9d01-4f1c-9508-ec94292425eb" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544568 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294a0599-742d-479d-9758-12c58c571da7" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544575 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="294a0599-742d-479d-9758-12c58c571da7" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544587 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6f11e7-3aec-43c4-a35d-13882953a668" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544594 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6f11e7-3aec-43c4-a35d-13882953a668" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: E1204 00:00:15.544604 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" containerName="swift-ring-rebalance" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544611 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" containerName="swift-ring-rebalance" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544796 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b8181a75-e25f-462c-90cb-1fddeea8ae6c" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544806 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1378ad3a-99d0-47e1-8c66-74a67341e30d" containerName="dnsmasq-dns" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544817 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="294a0599-742d-479d-9758-12c58c571da7" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544826 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="992d93ad-5b93-4369-adec-095082f4da81" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544834 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1258d21b-9d01-4f1c-9508-ec94292425eb" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544844 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7263a5a-cae0-4bbb-8429-d4d59d79c63b" containerName="mariadb-account-create-update" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544853 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" containerName="swift-ring-rebalance" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.544862 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6f11e7-3aec-43c4-a35d-13882953a668" containerName="mariadb-database-create" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.545455 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.547449 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.547700 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xwh7d" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.587207 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qwhzb"] Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.634868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85vq\" (UniqueName: \"kubernetes.io/projected/6c4fa015-bd9b-44c9-a09b-41630154ec52-kube-api-access-g85vq\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.634919 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-config-data\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.635041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-combined-ca-bundle\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.635061 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-db-sync-config-data\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.736664 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-combined-ca-bundle\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.736705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-db-sync-config-data\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.736789 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85vq\" (UniqueName: \"kubernetes.io/projected/6c4fa015-bd9b-44c9-a09b-41630154ec52-kube-api-access-g85vq\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.736808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-config-data\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.742000 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-db-sync-config-data\") pod \"glance-db-sync-qwhzb\" (UID: 
\"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.742285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-config-data\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.742813 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-combined-ca-bundle\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.755314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85vq\" (UniqueName: \"kubernetes.io/projected/6c4fa015-bd9b-44c9-a09b-41630154ec52-kube-api-access-g85vq\") pod \"glance-db-sync-qwhzb\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:15 crc kubenswrapper[4764]: I1204 00:00:15.870445 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qwhzb" Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.015813 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9","Type":"ContainerStarted","Data":"0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf"} Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.016347 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.020924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"76708e9b-1db4-42ca-94d2-7ff96d08d855","Type":"ContainerStarted","Data":"699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972"} Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.021122 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.047077 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.574476978 podStartE2EDuration="51.047058649s" podCreationTimestamp="2025-12-03 23:59:25 +0000 UTC" firstStartedPulling="2025-12-03 23:59:31.200702525 +0000 UTC m=+1106.962026936" lastFinishedPulling="2025-12-03 23:59:41.673284196 +0000 UTC m=+1117.434608607" observedRunningTime="2025-12-04 00:00:16.040545759 +0000 UTC m=+1151.801870180" watchObservedRunningTime="2025-12-04 00:00:16.047058649 +0000 UTC m=+1151.808383060" Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.072165 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.567222459 podStartE2EDuration="51.072143734s" podCreationTimestamp="2025-12-03 23:59:25 +0000 UTC" firstStartedPulling="2025-12-03 23:59:31.192381081 +0000 UTC 
m=+1106.953705492" lastFinishedPulling="2025-12-03 23:59:41.697302366 +0000 UTC m=+1117.458626767" observedRunningTime="2025-12-04 00:00:16.061005961 +0000 UTC m=+1151.822330372" watchObservedRunningTime="2025-12-04 00:00:16.072143734 +0000 UTC m=+1151.833468145" Dec 04 00:00:16 crc kubenswrapper[4764]: I1204 00:00:16.447529 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-qwhzb"] Dec 04 00:00:16 crc kubenswrapper[4764]: W1204 00:00:16.447522 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c4fa015_bd9b_44c9_a09b_41630154ec52.slice/crio-306b8e9a426e9f0aaeed04cba47d545c0a09cbe81dea15f635c9955519f47509 WatchSource:0}: Error finding container 306b8e9a426e9f0aaeed04cba47d545c0a09cbe81dea15f635c9955519f47509: Status 404 returned error can't find the container with id 306b8e9a426e9f0aaeed04cba47d545c0a09cbe81dea15f635c9955519f47509 Dec 04 00:00:17 crc kubenswrapper[4764]: I1204 00:00:17.041477 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qwhzb" event={"ID":"6c4fa015-bd9b-44c9-a09b-41630154ec52","Type":"ContainerStarted","Data":"306b8e9a426e9f0aaeed04cba47d545c0a09cbe81dea15f635c9955519f47509"} Dec 04 00:00:21 crc kubenswrapper[4764]: I1204 00:00:21.314471 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.296935 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l2vv9" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" probeResult="failure" output=< Dec 04 00:00:22 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 00:00:22 crc kubenswrapper[4764]: > Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.298958 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovn-controller-ovs-xnsqq" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.365559 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-xnsqq" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.569886 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l2vv9-config-fkb2r"] Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.570847 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.583896 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.606823 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l2vv9-config-fkb2r"] Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.652848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-log-ovn\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.652901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-additional-scripts\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.652964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.652990 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run-ovn\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.653005 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-scripts\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.653092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kllvg\" (UniqueName: \"kubernetes.io/projected/8739268e-ea69-4e17-a73c-90737c72f040-kube-api-access-kllvg\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.754780 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-log-ovn\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.754857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-additional-scripts\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.754897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.754920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run-ovn\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.754937 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-scripts\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.755021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kllvg\" (UniqueName: \"kubernetes.io/projected/8739268e-ea69-4e17-a73c-90737c72f040-kube-api-access-kllvg\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.755234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-log-ovn\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.755234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.755295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run-ovn\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.756864 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-scripts\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.757243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-additional-scripts\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.774156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kllvg\" (UniqueName: 
\"kubernetes.io/projected/8739268e-ea69-4e17-a73c-90737c72f040-kube-api-access-kllvg\") pod \"ovn-controller-l2vv9-config-fkb2r\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:22 crc kubenswrapper[4764]: I1204 00:00:22.907628 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:24 crc kubenswrapper[4764]: I1204 00:00:24.588229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 04 00:00:24 crc kubenswrapper[4764]: I1204 00:00:24.596290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"swift-storage-0\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " pod="openstack/swift-storage-0" Dec 04 00:00:24 crc kubenswrapper[4764]: I1204 00:00:24.886929 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.208937 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.301858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.531479 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l2vv9" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" probeResult="failure" output=< Dec 04 00:00:27 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 00:00:27 crc kubenswrapper[4764]: > Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.711981 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c616-account-create-update-x6pww"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.713037 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.720900 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.737475 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-sphs6"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.738820 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.746125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55klf\" (UniqueName: \"kubernetes.io/projected/a78c3705-cf1f-41d5-b80b-323d167e7cba-kube-api-access-55klf\") pod \"barbican-c616-account-create-update-x6pww\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.746303 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78c3705-cf1f-41d5-b80b-323d167e7cba-operator-scripts\") pod \"barbican-c616-account-create-update-x6pww\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.761966 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c616-account-create-update-x6pww"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.794996 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sphs6"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.827782 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-g9l4t"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.829082 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.835826 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2a5d-account-create-update-7b2vr"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.836846 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.840225 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.845565 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g9l4t"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847428 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55klf\" (UniqueName: \"kubernetes.io/projected/a78c3705-cf1f-41d5-b80b-323d167e7cba-kube-api-access-55klf\") pod \"barbican-c616-account-create-update-x6pww\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847472 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-operator-scripts\") pod \"cinder-2a5d-account-create-update-7b2vr\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b156df9-8487-4d3c-ae04-53a8ac281484-operator-scripts\") pod \"barbican-db-create-g9l4t\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgqd\" (UniqueName: \"kubernetes.io/projected/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-kube-api-access-txgqd\") pod \"cinder-db-create-sphs6\" (UID: 
\"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847774 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4x9\" (UniqueName: \"kubernetes.io/projected/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-kube-api-access-xg4x9\") pod \"cinder-2a5d-account-create-update-7b2vr\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847906 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78c3705-cf1f-41d5-b80b-323d167e7cba-operator-scripts\") pod \"barbican-c616-account-create-update-x6pww\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.847974 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-operator-scripts\") pod \"cinder-db-create-sphs6\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.848007 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r696t\" (UniqueName: \"kubernetes.io/projected/4b156df9-8487-4d3c-ae04-53a8ac281484-kube-api-access-r696t\") pod \"barbican-db-create-g9l4t\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.848802 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a78c3705-cf1f-41d5-b80b-323d167e7cba-operator-scripts\") pod \"barbican-c616-account-create-update-x6pww\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.854641 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2a5d-account-create-update-7b2vr"] Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.899532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55klf\" (UniqueName: \"kubernetes.io/projected/a78c3705-cf1f-41d5-b80b-323d167e7cba-kube-api-access-55klf\") pod \"barbican-c616-account-create-update-x6pww\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.949590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4x9\" (UniqueName: \"kubernetes.io/projected/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-kube-api-access-xg4x9\") pod \"cinder-2a5d-account-create-update-7b2vr\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.949661 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-operator-scripts\") pod \"cinder-db-create-sphs6\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.949682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r696t\" (UniqueName: \"kubernetes.io/projected/4b156df9-8487-4d3c-ae04-53a8ac281484-kube-api-access-r696t\") pod \"barbican-db-create-g9l4t\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " 
pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.949709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-operator-scripts\") pod \"cinder-2a5d-account-create-update-7b2vr\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.949766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b156df9-8487-4d3c-ae04-53a8ac281484-operator-scripts\") pod \"barbican-db-create-g9l4t\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.949782 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgqd\" (UniqueName: \"kubernetes.io/projected/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-kube-api-access-txgqd\") pod \"cinder-db-create-sphs6\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.950771 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b156df9-8487-4d3c-ae04-53a8ac281484-operator-scripts\") pod \"barbican-db-create-g9l4t\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.950819 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-operator-scripts\") pod \"cinder-db-create-sphs6\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 
crc kubenswrapper[4764]: I1204 00:00:27.951196 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-operator-scripts\") pod \"cinder-2a5d-account-create-update-7b2vr\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.967066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgqd\" (UniqueName: \"kubernetes.io/projected/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-kube-api-access-txgqd\") pod \"cinder-db-create-sphs6\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.967899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r696t\" (UniqueName: \"kubernetes.io/projected/4b156df9-8487-4d3c-ae04-53a8ac281484-kube-api-access-r696t\") pod \"barbican-db-create-g9l4t\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:27 crc kubenswrapper[4764]: I1204 00:00:27.968294 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4x9\" (UniqueName: \"kubernetes.io/projected/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-kube-api-access-xg4x9\") pod \"cinder-2a5d-account-create-update-7b2vr\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.030870 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.054313 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.057214 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-c2ls4"] Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.058199 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.060814 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.061670 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6jnw5" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.061999 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.062166 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.109323 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c2ls4"] Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.124771 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6bz55"] Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.126099 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.143032 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6bz55"] Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.156233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-operator-scripts\") pod \"neutron-db-create-6bz55\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.156281 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxnnk\" (UniqueName: \"kubernetes.io/projected/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-kube-api-access-fxnnk\") pod \"neutron-db-create-6bz55\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.156311 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-config-data\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.156528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfmpl\" (UniqueName: \"kubernetes.io/projected/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-kube-api-access-wfmpl\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.156587 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-combined-ca-bundle\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.157983 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4ef6-account-create-update-hgk4l"] Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.158295 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.159566 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.162990 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.167369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.180552 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4ef6-account-create-update-hgk4l"] Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258033 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-combined-ca-bundle\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-operator-scripts\") pod \"neutron-db-create-6bz55\" 
(UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxnnk\" (UniqueName: \"kubernetes.io/projected/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-kube-api-access-fxnnk\") pod \"neutron-db-create-6bz55\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-config-data\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhbk\" (UniqueName: \"kubernetes.io/projected/828472d4-0a60-47d4-906b-107c9cb63417-kube-api-access-hvhbk\") pod \"neutron-4ef6-account-create-update-hgk4l\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258645 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfmpl\" (UniqueName: \"kubernetes.io/projected/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-kube-api-access-wfmpl\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.258669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/828472d4-0a60-47d4-906b-107c9cb63417-operator-scripts\") pod 
\"neutron-4ef6-account-create-update-hgk4l\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.259457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-operator-scripts\") pod \"neutron-db-create-6bz55\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.261954 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-combined-ca-bundle\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.262431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-config-data\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.276691 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxnnk\" (UniqueName: \"kubernetes.io/projected/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-kube-api-access-fxnnk\") pod \"neutron-db-create-6bz55\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.278190 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfmpl\" (UniqueName: \"kubernetes.io/projected/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-kube-api-access-wfmpl\") pod \"keystone-db-sync-c2ls4\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " 
pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.360410 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/828472d4-0a60-47d4-906b-107c9cb63417-operator-scripts\") pod \"neutron-4ef6-account-create-update-hgk4l\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.360550 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvhbk\" (UniqueName: \"kubernetes.io/projected/828472d4-0a60-47d4-906b-107c9cb63417-kube-api-access-hvhbk\") pod \"neutron-4ef6-account-create-update-hgk4l\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.361491 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/828472d4-0a60-47d4-906b-107c9cb63417-operator-scripts\") pod \"neutron-4ef6-account-create-update-hgk4l\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.377571 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.380916 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvhbk\" (UniqueName: \"kubernetes.io/projected/828472d4-0a60-47d4-906b-107c9cb63417-kube-api-access-hvhbk\") pod \"neutron-4ef6-account-create-update-hgk4l\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.456216 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:28 crc kubenswrapper[4764]: I1204 00:00:28.483899 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:29 crc kubenswrapper[4764]: E1204 00:00:29.048825 4764 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.13:37104->38.102.83.13:39483: write tcp 192.168.126.11:10250->192.168.126.11:40584: write: broken pipe Dec 04 00:00:31 crc kubenswrapper[4764]: E1204 00:00:31.864329 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63" Dec 04 00:00:31 crc kubenswrapper[4764]: E1204 00:00:31.864922 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g85vq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-qwhzb_openstack(6c4fa015-bd9b-44c9-a09b-41630154ec52): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Dec 04 00:00:31 crc kubenswrapper[4764]: E1204 00:00:31.866503 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-qwhzb" podUID="6c4fa015-bd9b-44c9-a09b-41630154ec52" Dec 04 00:00:32 crc kubenswrapper[4764]: E1204 00:00:32.182193 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:481073ac9deefb38bbd989aaa8dd7aedb4e0af26017f4883f85fce433380bf63\\\"\"" pod="openstack/glance-db-sync-qwhzb" podUID="6c4fa015-bd9b-44c9-a09b-41630154ec52" Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.304129 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l2vv9" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" probeResult="failure" output=< Dec 04 00:00:32 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 00:00:32 crc kubenswrapper[4764]: > Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.574951 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2a5d-account-create-update-7b2vr"] Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.574994 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c616-account-create-update-x6pww"] Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.580482 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-g9l4t"] Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.588934 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-c2ls4"] Dec 04 00:00:32 crc 
kubenswrapper[4764]: I1204 00:00:32.693944 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4ef6-account-create-update-hgk4l"] Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.702539 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l2vv9-config-fkb2r"] Dec 04 00:00:32 crc kubenswrapper[4764]: W1204 00:00:32.707260 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8739268e_ea69_4e17_a73c_90737c72f040.slice/crio-87792c4eafef78b260dc9b0e00b0c1ee997746b496b4a610bf9f16407a4fda6a WatchSource:0}: Error finding container 87792c4eafef78b260dc9b0e00b0c1ee997746b496b4a610bf9f16407a4fda6a: Status 404 returned error can't find the container with id 87792c4eafef78b260dc9b0e00b0c1ee997746b496b4a610bf9f16407a4fda6a Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.713584 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6bz55"] Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.836662 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-sphs6"] Dec 04 00:00:32 crc kubenswrapper[4764]: I1204 00:00:32.951709 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 00:00:33 crc kubenswrapper[4764]: W1204 00:00:33.015211 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1691fb5b_c57a_4773_9710_347c99bd9712.slice/crio-62535ef31ad81bf101df72585b98d5c55b463ae8889b980ee0b9e4ef0c4b917e WatchSource:0}: Error finding container 62535ef31ad81bf101df72585b98d5c55b463ae8889b980ee0b9e4ef0c4b917e: Status 404 returned error can't find the container with id 62535ef31ad81bf101df72585b98d5c55b463ae8889b980ee0b9e4ef0c4b917e Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.186228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-l2vv9-config-fkb2r" event={"ID":"8739268e-ea69-4e17-a73c-90737c72f040","Type":"ContainerStarted","Data":"87792c4eafef78b260dc9b0e00b0c1ee997746b496b4a610bf9f16407a4fda6a"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.187572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c2ls4" event={"ID":"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98","Type":"ContainerStarted","Data":"01039d96b0145c5e96c468441bc955c36cbf2b503bb148527ca8cee30dc04f6f"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.188842 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6bz55" event={"ID":"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122","Type":"ContainerStarted","Data":"b1460149953d285259b41b408645d604c6d7babcebd9204034ade8cfe14e6a96"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.190481 4764 generic.go:334] "Generic (PLEG): container finished" podID="4b156df9-8487-4d3c-ae04-53a8ac281484" containerID="4cde283132633ce22f471f0f58be753fa22ff0b0595a209ec83ed97ac2675ac1" exitCode=0 Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.190560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g9l4t" event={"ID":"4b156df9-8487-4d3c-ae04-53a8ac281484","Type":"ContainerDied","Data":"4cde283132633ce22f471f0f58be753fa22ff0b0595a209ec83ed97ac2675ac1"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.190596 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g9l4t" event={"ID":"4b156df9-8487-4d3c-ae04-53a8ac281484","Type":"ContainerStarted","Data":"070e1e50bb5c7a8498f6b5e0028c7226973582ea4de1630d83c43cd9133586d2"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.191784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4ef6-account-create-update-hgk4l" 
event={"ID":"828472d4-0a60-47d4-906b-107c9cb63417","Type":"ContainerStarted","Data":"f362a948c608b3bac1d2966914eef2df7e6fb1d71bf9530591205f1d0fe8e4bf"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.196320 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"62535ef31ad81bf101df72585b98d5c55b463ae8889b980ee0b9e4ef0c4b917e"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.199970 4764 generic.go:334] "Generic (PLEG): container finished" podID="a78c3705-cf1f-41d5-b80b-323d167e7cba" containerID="673b3fe77d5ce18df617b4e4a03402aed38e4d2372f2fea89273e416b06250ea" exitCode=0 Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.200067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c616-account-create-update-x6pww" event={"ID":"a78c3705-cf1f-41d5-b80b-323d167e7cba","Type":"ContainerDied","Data":"673b3fe77d5ce18df617b4e4a03402aed38e4d2372f2fea89273e416b06250ea"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.200113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c616-account-create-update-x6pww" event={"ID":"a78c3705-cf1f-41d5-b80b-323d167e7cba","Type":"ContainerStarted","Data":"04bcd8e4968dc473444ffd2a1432f81990748925a48ae570a2c93c182374157b"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.201902 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" containerID="aeeb1f8cd6a357fb4921a9a73369f690e025e106efab990b4dd29d561bf85075" exitCode=0 Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.201988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a5d-account-create-update-7b2vr" event={"ID":"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938","Type":"ContainerDied","Data":"aeeb1f8cd6a357fb4921a9a73369f690e025e106efab990b4dd29d561bf85075"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 
00:00:33.202025 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a5d-account-create-update-7b2vr" event={"ID":"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938","Type":"ContainerStarted","Data":"9b705dbe69c592c7b2ed778296b9c631467defc673ebf2cd1bccc3744c0f8daa"} Dec 04 00:00:33 crc kubenswrapper[4764]: I1204 00:00:33.210487 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sphs6" event={"ID":"eb6d78be-5cb8-49c0-9cde-4bfae85c513a","Type":"ContainerStarted","Data":"78824261779fcbdf56975c254235eabb966506a030182db3226b9b209a3fccdf"} Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.222236 4764 generic.go:334] "Generic (PLEG): container finished" podID="8739268e-ea69-4e17-a73c-90737c72f040" containerID="8abc6b33fb7fe9bf79a76e4d8b9e725d01e6670c7d048013f125d40d01b42df8" exitCode=0 Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.222358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l2vv9-config-fkb2r" event={"ID":"8739268e-ea69-4e17-a73c-90737c72f040","Type":"ContainerDied","Data":"8abc6b33fb7fe9bf79a76e4d8b9e725d01e6670c7d048013f125d40d01b42df8"} Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.226058 4764 generic.go:334] "Generic (PLEG): container finished" podID="3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" containerID="ed9a7bb446932db4138c96413b14192c82a00e714736a59bb5dacccc3da2a6dd" exitCode=0 Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.226112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6bz55" event={"ID":"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122","Type":"ContainerDied","Data":"ed9a7bb446932db4138c96413b14192c82a00e714736a59bb5dacccc3da2a6dd"} Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.231699 4764 generic.go:334] "Generic (PLEG): container finished" podID="eb6d78be-5cb8-49c0-9cde-4bfae85c513a" containerID="d13466ee54bb26aca109c5905a35596e448e0e1848d2f230f30f0947a39a72a8" exitCode=0 Dec 04 00:00:34 crc 
kubenswrapper[4764]: I1204 00:00:34.231736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sphs6" event={"ID":"eb6d78be-5cb8-49c0-9cde-4bfae85c513a","Type":"ContainerDied","Data":"d13466ee54bb26aca109c5905a35596e448e0e1848d2f230f30f0947a39a72a8"} Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.242109 4764 generic.go:334] "Generic (PLEG): container finished" podID="828472d4-0a60-47d4-906b-107c9cb63417" containerID="058b0b320e4f649818a6f21558010cb53f95ece570ce4947f8e23a700f404e0f" exitCode=0 Dec 04 00:00:34 crc kubenswrapper[4764]: I1204 00:00:34.242234 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4ef6-account-create-update-hgk4l" event={"ID":"828472d4-0a60-47d4-906b-107c9cb63417","Type":"ContainerDied","Data":"058b0b320e4f649818a6f21558010cb53f95ece570ce4947f8e23a700f404e0f"} Dec 04 00:00:35 crc kubenswrapper[4764]: I1204 00:00:35.258370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc"} Dec 04 00:00:35 crc kubenswrapper[4764]: I1204 00:00:35.258687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.117987 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.148303 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.172033 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.180915 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.197754 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.200007 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.221918 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.240640 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/828472d4-0a60-47d4-906b-107c9cb63417-operator-scripts\") pod \"828472d4-0a60-47d4-906b-107c9cb63417\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.240734 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b156df9-8487-4d3c-ae04-53a8ac281484-operator-scripts\") pod \"4b156df9-8487-4d3c-ae04-53a8ac281484\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.240866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r696t\" (UniqueName: \"kubernetes.io/projected/4b156df9-8487-4d3c-ae04-53a8ac281484-kube-api-access-r696t\") pod \"4b156df9-8487-4d3c-ae04-53a8ac281484\" (UID: \"4b156df9-8487-4d3c-ae04-53a8ac281484\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.240949 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvhbk\" (UniqueName: \"kubernetes.io/projected/828472d4-0a60-47d4-906b-107c9cb63417-kube-api-access-hvhbk\") pod \"828472d4-0a60-47d4-906b-107c9cb63417\" (UID: \"828472d4-0a60-47d4-906b-107c9cb63417\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.241731 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/828472d4-0a60-47d4-906b-107c9cb63417-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "828472d4-0a60-47d4-906b-107c9cb63417" (UID: "828472d4-0a60-47d4-906b-107c9cb63417"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.242504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b156df9-8487-4d3c-ae04-53a8ac281484-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b156df9-8487-4d3c-ae04-53a8ac281484" (UID: "4b156df9-8487-4d3c-ae04-53a8ac281484"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.247049 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b156df9-8487-4d3c-ae04-53a8ac281484-kube-api-access-r696t" (OuterVolumeSpecName: "kube-api-access-r696t") pod "4b156df9-8487-4d3c-ae04-53a8ac281484" (UID: "4b156df9-8487-4d3c-ae04-53a8ac281484"). InnerVolumeSpecName "kube-api-access-r696t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.249807 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/828472d4-0a60-47d4-906b-107c9cb63417-kube-api-access-hvhbk" (OuterVolumeSpecName: "kube-api-access-hvhbk") pod "828472d4-0a60-47d4-906b-107c9cb63417" (UID: "828472d4-0a60-47d4-906b-107c9cb63417"). InnerVolumeSpecName "kube-api-access-hvhbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.277015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6bz55" event={"ID":"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122","Type":"ContainerDied","Data":"b1460149953d285259b41b408645d604c6d7babcebd9204034ade8cfe14e6a96"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.277056 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1460149953d285259b41b408645d604c6d7babcebd9204034ade8cfe14e6a96" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.277110 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bz55" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.282810 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-g9l4t" event={"ID":"4b156df9-8487-4d3c-ae04-53a8ac281484","Type":"ContainerDied","Data":"070e1e50bb5c7a8498f6b5e0028c7226973582ea4de1630d83c43cd9133586d2"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.282837 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="070e1e50bb5c7a8498f6b5e0028c7226973582ea4de1630d83c43cd9133586d2" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.282881 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-g9l4t" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.285624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-sphs6" event={"ID":"eb6d78be-5cb8-49c0-9cde-4bfae85c513a","Type":"ContainerDied","Data":"78824261779fcbdf56975c254235eabb966506a030182db3226b9b209a3fccdf"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.285646 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78824261779fcbdf56975c254235eabb966506a030182db3226b9b209a3fccdf" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.285685 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-sphs6" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.286927 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c616-account-create-update-x6pww" event={"ID":"a78c3705-cf1f-41d5-b80b-323d167e7cba","Type":"ContainerDied","Data":"04bcd8e4968dc473444ffd2a1432f81990748925a48ae570a2c93c182374157b"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.286949 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04bcd8e4968dc473444ffd2a1432f81990748925a48ae570a2c93c182374157b" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.287069 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c616-account-create-update-x6pww" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.292587 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-l2vv9" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.295330 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l2vv9-config-fkb2r" event={"ID":"8739268e-ea69-4e17-a73c-90737c72f040","Type":"ContainerDied","Data":"87792c4eafef78b260dc9b0e00b0c1ee997746b496b4a610bf9f16407a4fda6a"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.295376 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87792c4eafef78b260dc9b0e00b0c1ee997746b496b4a610bf9f16407a4fda6a" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.295345 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l2vv9-config-fkb2r" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.297142 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c2ls4" event={"ID":"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98","Type":"ContainerStarted","Data":"df775a1093974954fab259790271001b41d634ad321f7517bb39959e4a9fed30"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.300567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4ef6-account-create-update-hgk4l" event={"ID":"828472d4-0a60-47d4-906b-107c9cb63417","Type":"ContainerDied","Data":"f362a948c608b3bac1d2966914eef2df7e6fb1d71bf9530591205f1d0fe8e4bf"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.300675 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f362a948c608b3bac1d2966914eef2df7e6fb1d71bf9530591205f1d0fe8e4bf" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.300798 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4ef6-account-create-update-hgk4l" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.307160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2a5d-account-create-update-7b2vr" event={"ID":"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938","Type":"ContainerDied","Data":"9b705dbe69c592c7b2ed778296b9c631467defc673ebf2cd1bccc3744c0f8daa"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.307205 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b705dbe69c592c7b2ed778296b9c631467defc673ebf2cd1bccc3744c0f8daa" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.307278 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2a5d-account-create-update-7b2vr" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.324063 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6"} Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.342185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78c3705-cf1f-41d5-b80b-323d167e7cba-operator-scripts\") pod \"a78c3705-cf1f-41d5-b80b-323d167e7cba\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.342481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kllvg\" (UniqueName: \"kubernetes.io/projected/8739268e-ea69-4e17-a73c-90737c72f040-kube-api-access-kllvg\") pod \"8739268e-ea69-4e17-a73c-90737c72f040\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run\") pod \"8739268e-ea69-4e17-a73c-90737c72f040\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343334 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55klf\" (UniqueName: \"kubernetes.io/projected/a78c3705-cf1f-41d5-b80b-323d167e7cba-kube-api-access-55klf\") pod \"a78c3705-cf1f-41d5-b80b-323d167e7cba\" (UID: \"a78c3705-cf1f-41d5-b80b-323d167e7cba\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343401 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-scripts\") pod \"8739268e-ea69-4e17-a73c-90737c72f040\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343507 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-operator-scripts\") pod \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-log-ovn\") pod \"8739268e-ea69-4e17-a73c-90737c72f040\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343654 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxnnk\" (UniqueName: \"kubernetes.io/projected/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-kube-api-access-fxnnk\") pod \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " Dec 04 00:00:37 crc 
kubenswrapper[4764]: I1204 00:00:37.343754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-operator-scripts\") pod \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\" (UID: \"3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.344163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-additional-scripts\") pod \"8739268e-ea69-4e17-a73c-90737c72f040\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.344249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-operator-scripts\") pod \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78c3705-cf1f-41d5-b80b-323d167e7cba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a78c3705-cf1f-41d5-b80b-323d167e7cba" (UID: "a78c3705-cf1f-41d5-b80b-323d167e7cba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8739268e-ea69-4e17-a73c-90737c72f040" (UID: "8739268e-ea69-4e17-a73c-90737c72f040"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.343913 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run" (OuterVolumeSpecName: "var-run") pod "8739268e-ea69-4e17-a73c-90737c72f040" (UID: "8739268e-ea69-4e17-a73c-90737c72f040"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.344318 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" (UID: "3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.344551 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" (UID: "cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.344643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txgqd\" (UniqueName: \"kubernetes.io/projected/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-kube-api-access-txgqd\") pod \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\" (UID: \"eb6d78be-5cb8-49c0-9cde-4bfae85c513a\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.345115 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg4x9\" (UniqueName: \"kubernetes.io/projected/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-kube-api-access-xg4x9\") pod \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\" (UID: \"cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.345222 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run-ovn\") pod \"8739268e-ea69-4e17-a73c-90737c72f040\" (UID: \"8739268e-ea69-4e17-a73c-90737c72f040\") " Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.344728 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8739268e-ea69-4e17-a73c-90737c72f040" (UID: "8739268e-ea69-4e17-a73c-90737c72f040"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.345690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-scripts" (OuterVolumeSpecName: "scripts") pod "8739268e-ea69-4e17-a73c-90737c72f040" (UID: "8739268e-ea69-4e17-a73c-90737c72f040"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8739268e-ea69-4e17-a73c-90737c72f040" (UID: "8739268e-ea69-4e17-a73c-90737c72f040"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.346811 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347602 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347672 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r696t\" (UniqueName: \"kubernetes.io/projected/4b156df9-8487-4d3c-ae04-53a8ac281484-kube-api-access-r696t\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347740 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347791 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvhbk\" (UniqueName: \"kubernetes.io/projected/828472d4-0a60-47d4-906b-107c9cb63417-kube-api-access-hvhbk\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347837 4764 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/828472d4-0a60-47d4-906b-107c9cb63417-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347885 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a78c3705-cf1f-41d5-b80b-323d167e7cba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347931 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b156df9-8487-4d3c-ae04-53a8ac281484-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.347985 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.348337 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb6d78be-5cb8-49c0-9cde-4bfae85c513a" (UID: "eb6d78be-5cb8-49c0-9cde-4bfae85c513a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.352599 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-c2ls4" podStartSLOduration=4.981585226 podStartE2EDuration="9.35257893s" podCreationTimestamp="2025-12-04 00:00:28 +0000 UTC" firstStartedPulling="2025-12-04 00:00:32.582748954 +0000 UTC m=+1168.344073365" lastFinishedPulling="2025-12-04 00:00:36.953742658 +0000 UTC m=+1172.715067069" observedRunningTime="2025-12-04 00:00:37.33507464 +0000 UTC m=+1173.096399041" watchObservedRunningTime="2025-12-04 00:00:37.35257893 +0000 UTC m=+1173.113903341" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.356538 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8739268e-ea69-4e17-a73c-90737c72f040-kube-api-access-kllvg" (OuterVolumeSpecName: "kube-api-access-kllvg") pod "8739268e-ea69-4e17-a73c-90737c72f040" (UID: "8739268e-ea69-4e17-a73c-90737c72f040"). InnerVolumeSpecName "kube-api-access-kllvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.366374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-kube-api-access-fxnnk" (OuterVolumeSpecName: "kube-api-access-fxnnk") pod "3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" (UID: "3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122"). InnerVolumeSpecName "kube-api-access-fxnnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.366595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-kube-api-access-txgqd" (OuterVolumeSpecName: "kube-api-access-txgqd") pod "eb6d78be-5cb8-49c0-9cde-4bfae85c513a" (UID: "eb6d78be-5cb8-49c0-9cde-4bfae85c513a"). 
InnerVolumeSpecName "kube-api-access-txgqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.366865 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78c3705-cf1f-41d5-b80b-323d167e7cba-kube-api-access-55klf" (OuterVolumeSpecName: "kube-api-access-55klf") pod "a78c3705-cf1f-41d5-b80b-323d167e7cba" (UID: "a78c3705-cf1f-41d5-b80b-323d167e7cba"). InnerVolumeSpecName "kube-api-access-55klf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.370003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-kube-api-access-xg4x9" (OuterVolumeSpecName: "kube-api-access-xg4x9") pod "cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" (UID: "cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938"). InnerVolumeSpecName "kube-api-access-xg4x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450229 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kllvg\" (UniqueName: \"kubernetes.io/projected/8739268e-ea69-4e17-a73c-90737c72f040-kube-api-access-kllvg\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450259 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55klf\" (UniqueName: \"kubernetes.io/projected/a78c3705-cf1f-41d5-b80b-323d167e7cba-kube-api-access-55klf\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450269 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450292 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxnnk\" 
(UniqueName: \"kubernetes.io/projected/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122-kube-api-access-fxnnk\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450301 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8739268e-ea69-4e17-a73c-90737c72f040-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450309 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450317 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txgqd\" (UniqueName: \"kubernetes.io/projected/eb6d78be-5cb8-49c0-9cde-4bfae85c513a-kube-api-access-txgqd\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450325 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg4x9\" (UniqueName: \"kubernetes.io/projected/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938-kube-api-access-xg4x9\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:37 crc kubenswrapper[4764]: I1204 00:00:37.450333 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8739268e-ea69-4e17-a73c-90737c72f040-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:38 crc kubenswrapper[4764]: I1204 00:00:38.358402 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l2vv9-config-fkb2r"] Dec 04 00:00:38 crc kubenswrapper[4764]: I1204 00:00:38.366988 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-l2vv9-config-fkb2r"] Dec 04 00:00:38 crc kubenswrapper[4764]: I1204 00:00:38.375913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80"} Dec 04 00:00:38 crc kubenswrapper[4764]: I1204 00:00:38.555445 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8739268e-ea69-4e17-a73c-90737c72f040" path="/var/lib/kubelet/pods/8739268e-ea69-4e17-a73c-90737c72f040/volumes" Dec 04 00:00:40 crc kubenswrapper[4764]: I1204 00:00:40.397085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6"} Dec 04 00:00:40 crc kubenswrapper[4764]: I1204 00:00:40.397510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf"} Dec 04 00:00:40 crc kubenswrapper[4764]: I1204 00:00:40.397528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe"} Dec 04 00:00:40 crc kubenswrapper[4764]: I1204 00:00:40.397543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8"} Dec 04 00:00:41 crc kubenswrapper[4764]: I1204 00:00:41.428368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9"} Dec 04 00:00:41 crc kubenswrapper[4764]: I1204 00:00:41.437969 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" containerID="df775a1093974954fab259790271001b41d634ad321f7517bb39959e4a9fed30" exitCode=0 Dec 04 00:00:41 crc kubenswrapper[4764]: I1204 00:00:41.438022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c2ls4" event={"ID":"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98","Type":"ContainerDied","Data":"df775a1093974954fab259790271001b41d634ad321f7517bb39959e4a9fed30"} Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.454156 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a"} Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.454536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52"} Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.454556 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372"} Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.454572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a"} Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.454589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566"} Dec 04 
00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.753400 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.867396 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-config-data\") pod \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.867556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfmpl\" (UniqueName: \"kubernetes.io/projected/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-kube-api-access-wfmpl\") pod \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.867598 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-combined-ca-bundle\") pod \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\" (UID: \"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98\") " Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.882355 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-kube-api-access-wfmpl" (OuterVolumeSpecName: "kube-api-access-wfmpl") pod "ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" (UID: "ccb540b8-d57c-4a9e-ba19-52a0ed22cb98"). InnerVolumeSpecName "kube-api-access-wfmpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.902459 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" (UID: "ccb540b8-d57c-4a9e-ba19-52a0ed22cb98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.919043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-config-data" (OuterVolumeSpecName: "config-data") pod "ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" (UID: "ccb540b8-d57c-4a9e-ba19-52a0ed22cb98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.969851 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.969886 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfmpl\" (UniqueName: \"kubernetes.io/projected/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-kube-api-access-wfmpl\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:42 crc kubenswrapper[4764]: I1204 00:00:42.969902 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.465640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-c2ls4" 
event={"ID":"ccb540b8-d57c-4a9e-ba19-52a0ed22cb98","Type":"ContainerDied","Data":"01039d96b0145c5e96c468441bc955c36cbf2b503bb148527ca8cee30dc04f6f"} Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.465968 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01039d96b0145c5e96c468441bc955c36cbf2b503bb148527ca8cee30dc04f6f" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.465810 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-c2ls4" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.482694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerStarted","Data":"65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415"} Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.559207 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=44.457342426 podStartE2EDuration="52.559175423s" podCreationTimestamp="2025-12-03 23:59:51 +0000 UTC" firstStartedPulling="2025-12-04 00:00:33.017422785 +0000 UTC m=+1168.778747196" lastFinishedPulling="2025-12-04 00:00:41.119255762 +0000 UTC m=+1176.880580193" observedRunningTime="2025-12-04 00:00:43.536908057 +0000 UTC m=+1179.298232478" watchObservedRunningTime="2025-12-04 00:00:43.559175423 +0000 UTC m=+1179.320499844" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.699561 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5966d87587-xx7pt"] Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.699931 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b156df9-8487-4d3c-ae04-53a8ac281484" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.699946 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4b156df9-8487-4d3c-ae04-53a8ac281484" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.699968 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8739268e-ea69-4e17-a73c-90737c72f040" containerName="ovn-config" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.699974 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8739268e-ea69-4e17-a73c-90737c72f040" containerName="ovn-config" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.699983 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" containerName="keystone-db-sync" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.699988 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" containerName="keystone-db-sync" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.699997 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700002 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.700014 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700019 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.700025 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="828472d4-0a60-47d4-906b-107c9cb63417" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700031 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="828472d4-0a60-47d4-906b-107c9cb63417" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.700040 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb6d78be-5cb8-49c0-9cde-4bfae85c513a" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700045 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6d78be-5cb8-49c0-9cde-4bfae85c513a" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.700065 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a78c3705-cf1f-41d5-b80b-323d167e7cba" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700070 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a78c3705-cf1f-41d5-b80b-323d167e7cba" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700211 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a78c3705-cf1f-41d5-b80b-323d167e7cba" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700234 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="828472d4-0a60-47d4-906b-107c9cb63417" containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700245 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb6d78be-5cb8-49c0-9cde-4bfae85c513a" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700258 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8739268e-ea69-4e17-a73c-90737c72f040" containerName="ovn-config" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700271 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" 
containerName="mariadb-account-create-update" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700280 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700289 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b156df9-8487-4d3c-ae04-53a8ac281484" containerName="mariadb-database-create" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.700298 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" containerName="keystone-db-sync" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.701163 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.716427 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fl8sb"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.717766 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.721370 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.721573 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.721748 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.722333 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.722382 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6jnw5" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.729255 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5966d87587-xx7pt"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.747567 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fl8sb"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.882782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khl6j\" (UniqueName: \"kubernetes.io/projected/c92ef6c0-37d1-4296-b63c-cc63e79081d6-kube-api-access-khl6j\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.882853 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-combined-ca-bundle\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " 
pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.882896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-dns-svc\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.882958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-scripts\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.882980 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-fernet-keys\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.883034 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmn4k\" (UniqueName: \"kubernetes.io/projected/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-kube-api-access-qmn4k\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.883063 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-config-data\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " 
pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.883083 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-config\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.883102 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-nb\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.883208 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-credential-keys\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.883262 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-sb\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.888687 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.891233 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.901822 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.902006 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7hd69"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.902938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.903394 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.908690 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7lj2c" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.911741 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.912346 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.913640 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.924882 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7hd69"] Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.933223 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5966d87587-xx7pt"] Dec 04 00:00:43 crc kubenswrapper[4764]: E1204 00:00:43.933767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-qmn4k ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openstack/dnsmasq-dns-5966d87587-xx7pt" podUID="948b0b90-4d1b-4cfc-b649-720bd7a5b69e" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-run-httpd\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984603 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-combined-ca-bundle\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984634 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-dns-svc\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984654 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-config-data\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc 
kubenswrapper[4764]: I1204 00:00:43.984736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-scripts\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-fernet-keys\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmn4k\" (UniqueName: \"kubernetes.io/projected/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-kube-api-access-qmn4k\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984799 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-scripts\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984826 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-log-httpd\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-config-data\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984868 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-config\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984888 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-nb\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.984983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-credential-keys\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.985026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-sb\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.985057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjm7r\" (UniqueName: \"kubernetes.io/projected/2542d1bd-14ae-4a06-826c-967db5f367b6-kube-api-access-vjm7r\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.985084 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khl6j\" (UniqueName: \"kubernetes.io/projected/c92ef6c0-37d1-4296-b63c-cc63e79081d6-kube-api-access-khl6j\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.988033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-dns-svc\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.988043 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-config\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.995443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-config-data\") pod 
\"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.996202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-scripts\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.996855 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-nb\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.996863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-combined-ca-bundle\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.996884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-fernet-keys\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:43 crc kubenswrapper[4764]: I1204 00:00:43.997772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-sb\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:44 crc 
kubenswrapper[4764]: I1204 00:00:44.003921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-credential-keys\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.031770 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f9c88b76f-d7mrk"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.033536 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.036654 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.051687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmn4k\" (UniqueName: \"kubernetes.io/projected/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-kube-api-access-qmn4k\") pod \"dnsmasq-dns-5966d87587-xx7pt\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.055837 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khl6j\" (UniqueName: \"kubernetes.io/projected/c92ef6c0-37d1-4296-b63c-cc63e79081d6-kube-api-access-khl6j\") pod \"keystone-bootstrap-fl8sb\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.055953 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-rsw7c"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.057422 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.063555 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9c88b76f-d7mrk"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.067281 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.067458 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9zbc6" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.067568 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.083950 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rsw7c"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089514 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-combined-ca-bundle\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-config-data\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-config\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" 
Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-scripts\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089686 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-log-httpd\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjm7r\" (UniqueName: \"kubernetes.io/projected/2542d1bd-14ae-4a06-826c-967db5f367b6-kube-api-access-vjm7r\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-run-httpd\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.089854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm7mq\" (UniqueName: \"kubernetes.io/projected/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-kube-api-access-fm7mq\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.091683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-log-httpd\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.095738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-run-httpd\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.096544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-config-data\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.100666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-scripts\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.101159 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.115901 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.128532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjm7r\" (UniqueName: \"kubernetes.io/projected/2542d1bd-14ae-4a06-826c-967db5f367b6-kube-api-access-vjm7r\") pod \"ceilometer-0\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") " pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.151784 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-js842"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.152837 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.160291 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fwqqd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.160493 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.160644 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.176776 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dlttd"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.177858 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.187100 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.187335 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hcczz" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxz77\" (UniqueName: \"kubernetes.io/projected/f110ed53-cae7-4a1c-b62f-5cbf21158737-kube-api-access-nxz77\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: 
\"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193590 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-svc\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193621 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-combined-ca-bundle\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193656 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-config-data\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193700 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhwcb\" (UniqueName: \"kubernetes.io/projected/4312db12-846e-4bc4-8f2f-7121ac50776d-kube-api-access-dhwcb\") pod \"cinder-db-sync-rsw7c\" (UID: 
\"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193731 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-db-sync-config-data\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193777 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-scripts\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm7mq\" (UniqueName: \"kubernetes.io/projected/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-kube-api-access-fm7mq\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-combined-ca-bundle\") pod \"neutron-db-sync-7hd69\" (UID: 
\"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193855 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4312db12-846e-4bc4-8f2f-7121ac50776d-etc-machine-id\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193895 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-config\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.193915 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-config\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.205506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-config\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.208349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-combined-ca-bundle\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 
00:00:44.208503 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-js842"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.218976 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.237782 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dlttd"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.278249 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9c88b76f-d7mrk"] Dec 04 00:00:44 crc kubenswrapper[4764]: E1204 00:00:44.279009 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-nxz77 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" podUID="f110ed53-cae7-4a1c-b62f-5cbf21158737" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.284202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm7mq\" (UniqueName: \"kubernetes.io/projected/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-kube-api-access-fm7mq\") pod \"neutron-db-sync-7hd69\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.301884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk8jc\" (UniqueName: \"kubernetes.io/projected/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-kube-api-access-lk8jc\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhwcb\" (UniqueName: 
\"kubernetes.io/projected/4312db12-846e-4bc4-8f2f-7121ac50776d-kube-api-access-dhwcb\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302179 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e1b8-616d-469c-988d-f371d65275d9-logs\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302207 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-db-sync-config-data\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-scripts\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4312db12-846e-4bc4-8f2f-7121ac50776d-etc-machine-id\") pod 
\"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302414 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-combined-ca-bundle\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302456 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-config\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxz77\" (UniqueName: \"kubernetes.io/projected/f110ed53-cae7-4a1c-b62f-5cbf21158737-kube-api-access-nxz77\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302533 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-combined-ca-bundle\") pod \"placement-db-sync-js842\" (UID: 
\"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-svc\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302604 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-combined-ca-bundle\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302662 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-db-sync-config-data\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-config-data\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 
00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-scripts\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-config-data\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.302806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb7jn\" (UniqueName: \"kubernetes.io/projected/1ee4e1b8-616d-469c-988d-f371d65275d9-kube-api-access-sb7jn\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.303371 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.309195 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-svc\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.309283 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4312db12-846e-4bc4-8f2f-7121ac50776d-etc-machine-id\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.309922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-config\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.310692 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.311219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.312542 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-db-sync-config-data\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.327598 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-combined-ca-bundle\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.330882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhwcb\" (UniqueName: \"kubernetes.io/projected/4312db12-846e-4bc4-8f2f-7121ac50776d-kube-api-access-dhwcb\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.334204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-config-data\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.388933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxz77\" (UniqueName: \"kubernetes.io/projected/f110ed53-cae7-4a1c-b62f-5cbf21158737-kube-api-access-nxz77\") pod \"dnsmasq-dns-5f9c88b76f-d7mrk\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.393015 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.394955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-scripts\") pod \"cinder-db-sync-rsw7c\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.409624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk8jc\" (UniqueName: \"kubernetes.io/projected/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-kube-api-access-lk8jc\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.409916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e1b8-616d-469c-988d-f371d65275d9-logs\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.410008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-combined-ca-bundle\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.410065 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-combined-ca-bundle\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.410106 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-db-sync-config-data\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.410147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-scripts\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.410183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-config-data\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.410207 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb7jn\" (UniqueName: \"kubernetes.io/projected/1ee4e1b8-616d-469c-988d-f371d65275d9-kube-api-access-sb7jn\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.411185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e1b8-616d-469c-988d-f371d65275d9-logs\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.427136 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.442481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-scripts\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.443200 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-combined-ca-bundle\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.448240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-combined-ca-bundle\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.448318 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59bfd87765-ztwrc"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.448572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-db-sync-config-data\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.449547 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-config-data\") pod \"placement-db-sync-js842\" (UID: 
\"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.478256 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59bfd87765-ztwrc"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.478384 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.481083 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk8jc\" (UniqueName: \"kubernetes.io/projected/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-kube-api-access-lk8jc\") pod \"barbican-db-sync-dlttd\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.487310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb7jn\" (UniqueName: \"kubernetes.io/projected/1ee4e1b8-616d-469c-988d-f371d65275d9-kube-api-access-sb7jn\") pod \"placement-db-sync-js842\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.494990 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.497028 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dlttd" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.497606 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.516393 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.541417 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7hd69" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.552498 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.612342 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmn4k\" (UniqueName: \"kubernetes.io/projected/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-kube-api-access-qmn4k\") pod \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.612670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-sb\") pod \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.612803 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-config\") pod \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.612842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-nb\") pod \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.612917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-dns-svc\") pod \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\" (UID: \"948b0b90-4d1b-4cfc-b649-720bd7a5b69e\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.613178 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-nb\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.613258 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-swift-storage-0\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.613301 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-svc\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.613372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-config\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.613425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58pzk\" 
(UniqueName: \"kubernetes.io/projected/ff9327f4-311d-47f7-a6c0-2daf84054201-kube-api-access-58pzk\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.613508 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-sb\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.615538 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "948b0b90-4d1b-4cfc-b649-720bd7a5b69e" (UID: "948b0b90-4d1b-4cfc-b649-720bd7a5b69e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.615569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "948b0b90-4d1b-4cfc-b649-720bd7a5b69e" (UID: "948b0b90-4d1b-4cfc-b649-720bd7a5b69e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.616102 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-config" (OuterVolumeSpecName: "config") pod "948b0b90-4d1b-4cfc-b649-720bd7a5b69e" (UID: "948b0b90-4d1b-4cfc-b649-720bd7a5b69e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.616676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "948b0b90-4d1b-4cfc-b649-720bd7a5b69e" (UID: "948b0b90-4d1b-4cfc-b649-720bd7a5b69e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.619799 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-kube-api-access-qmn4k" (OuterVolumeSpecName: "kube-api-access-qmn4k") pod "948b0b90-4d1b-4cfc-b649-720bd7a5b69e" (UID: "948b0b90-4d1b-4cfc-b649-720bd7a5b69e"). InnerVolumeSpecName "kube-api-access-qmn4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.714481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-nb\") pod \"f110ed53-cae7-4a1c-b62f-5cbf21158737\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.714552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxz77\" (UniqueName: \"kubernetes.io/projected/f110ed53-cae7-4a1c-b62f-5cbf21158737-kube-api-access-nxz77\") pod \"f110ed53-cae7-4a1c-b62f-5cbf21158737\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.714615 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-sb\") pod \"f110ed53-cae7-4a1c-b62f-5cbf21158737\" (UID: 
\"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.714671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-svc\") pod \"f110ed53-cae7-4a1c-b62f-5cbf21158737\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.714821 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-swift-storage-0\") pod \"f110ed53-cae7-4a1c-b62f-5cbf21158737\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.714875 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-config\") pod \"f110ed53-cae7-4a1c-b62f-5cbf21158737\" (UID: \"f110ed53-cae7-4a1c-b62f-5cbf21158737\") " Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715181 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f110ed53-cae7-4a1c-b62f-5cbf21158737" (UID: "f110ed53-cae7-4a1c-b62f-5cbf21158737"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715307 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f110ed53-cae7-4a1c-b62f-5cbf21158737" (UID: "f110ed53-cae7-4a1c-b62f-5cbf21158737"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715595 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f110ed53-cae7-4a1c-b62f-5cbf21158737" (UID: "f110ed53-cae7-4a1c-b62f-5cbf21158737"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f110ed53-cae7-4a1c-b62f-5cbf21158737" (UID: "f110ed53-cae7-4a1c-b62f-5cbf21158737"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715622 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-svc\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-config\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715747 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58pzk\" (UniqueName: \"kubernetes.io/projected/ff9327f4-311d-47f7-a6c0-2daf84054201-kube-api-access-58pzk\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: 
\"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715753 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-config" (OuterVolumeSpecName: "config") pod "f110ed53-cae7-4a1c-b62f-5cbf21158737" (UID: "f110ed53-cae7-4a1c-b62f-5cbf21158737"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-sb\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715874 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-nb\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715908 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-swift-storage-0\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715949 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 
00:00:44.715959 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715968 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715977 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715985 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.715994 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmn4k\" (UniqueName: \"kubernetes.io/projected/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-kube-api-access-qmn4k\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.716003 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.716012 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.716022 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f110ed53-cae7-4a1c-b62f-5cbf21158737-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.716030 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/948b0b90-4d1b-4cfc-b649-720bd7a5b69e-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.716371 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-svc\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.716652 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-swift-storage-0\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.717138 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-sb\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.717343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-config\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.717640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-nb\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.721389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f110ed53-cae7-4a1c-b62f-5cbf21158737-kube-api-access-nxz77" (OuterVolumeSpecName: "kube-api-access-nxz77") pod "f110ed53-cae7-4a1c-b62f-5cbf21158737" (UID: "f110ed53-cae7-4a1c-b62f-5cbf21158737"). InnerVolumeSpecName "kube-api-access-nxz77". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.737810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58pzk\" (UniqueName: \"kubernetes.io/projected/ff9327f4-311d-47f7-a6c0-2daf84054201-kube-api-access-58pzk\") pod \"dnsmasq-dns-59bfd87765-ztwrc\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.779090 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-js842" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.815474 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.816806 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxz77\" (UniqueName: \"kubernetes.io/projected/f110ed53-cae7-4a1c-b62f-5cbf21158737-kube-api-access-nxz77\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.948149 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fl8sb"] Dec 04 00:00:44 crc kubenswrapper[4764]: I1204 00:00:44.963418 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.038937 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dlttd"] Dec 04 00:00:45 crc kubenswrapper[4764]: W1204 00:00:45.044583 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod959bf4d2_5d71_4a02_a0a0_8c417c2a7d31.slice/crio-e413a9bbbdd58e5f753f35c6c79b30f5fc9ea2dd85433c2d9e289d4c95873a8b WatchSource:0}: Error finding container e413a9bbbdd58e5f753f35c6c79b30f5fc9ea2dd85433c2d9e289d4c95873a8b: Status 404 returned error can't find the container with id e413a9bbbdd58e5f753f35c6c79b30f5fc9ea2dd85433c2d9e289d4c95873a8b Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.049951 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-rsw7c"] Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.169899 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7hd69"] Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.326646 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-js842"] Dec 04 00:00:45 crc kubenswrapper[4764]: W1204 00:00:45.338815 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ee4e1b8_616d_469c_988d_f371d65275d9.slice/crio-508f77551f1384d18dc5998659cb09d12043e85e77e21f543976951ac0729b99 WatchSource:0}: Error finding container 508f77551f1384d18dc5998659cb09d12043e85e77e21f543976951ac0729b99: Status 404 returned error can't find the container with id 508f77551f1384d18dc5998659cb09d12043e85e77e21f543976951ac0729b99 Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.380150 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59bfd87765-ztwrc"] Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.506395 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" event={"ID":"ff9327f4-311d-47f7-a6c0-2daf84054201","Type":"ContainerStarted","Data":"1f11f34a20afde9032dbfbcdf947e37858048f0ccb9458e010799d636c80a29f"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.507475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-js842" event={"ID":"1ee4e1b8-616d-469c-988d-f371d65275d9","Type":"ContainerStarted","Data":"508f77551f1384d18dc5998659cb09d12043e85e77e21f543976951ac0729b99"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.511251 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl8sb" event={"ID":"c92ef6c0-37d1-4296-b63c-cc63e79081d6","Type":"ContainerStarted","Data":"f956e9fd6be8e36f08c3d2e54ef6651af5346794be3c9e0b2c337f6520a43b6f"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.511283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl8sb" event={"ID":"c92ef6c0-37d1-4296-b63c-cc63e79081d6","Type":"ContainerStarted","Data":"7a3cb009e568a5723441689b260ab1ce2bc44d56a684eef3ac6419c716ab9144"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.516173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dlttd" 
event={"ID":"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31","Type":"ContainerStarted","Data":"e413a9bbbdd58e5f753f35c6c79b30f5fc9ea2dd85433c2d9e289d4c95873a8b"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.522118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerStarted","Data":"50028f1683db995340d7687fec79b71e9f1d67534d558a062f71f9252d9f34b8"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.525253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7hd69" event={"ID":"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded","Type":"ContainerStarted","Data":"3ea7362b2e46685a8e6f0b8977d5d3a7b0a135c14758f5677b71e9e97057f3b0"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.525297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7hd69" event={"ID":"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded","Type":"ContainerStarted","Data":"c730a1eb0335a0fd6b6fb6d1a2700a6b3c1fb3ae281cca4351c21424269782e6"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.538073 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5966d87587-xx7pt" Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.538342 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f9c88b76f-d7mrk" Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.538058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rsw7c" event={"ID":"4312db12-846e-4bc4-8f2f-7121ac50776d","Type":"ContainerStarted","Data":"b04f31726c41ec411a5c7db753e3faf74044e6ddce46edc8c2605ab42f03f9c9"} Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.556588 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fl8sb" podStartSLOduration=2.556569482 podStartE2EDuration="2.556569482s" podCreationTimestamp="2025-12-04 00:00:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:45.531552679 +0000 UTC m=+1181.292877090" watchObservedRunningTime="2025-12-04 00:00:45.556569482 +0000 UTC m=+1181.317893893" Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.563940 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7hd69" podStartSLOduration=2.563920632 podStartE2EDuration="2.563920632s" podCreationTimestamp="2025-12-04 00:00:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:45.556520151 +0000 UTC m=+1181.317844592" watchObservedRunningTime="2025-12-04 00:00:45.563920632 +0000 UTC m=+1181.325245043" Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.632208 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5966d87587-xx7pt"] Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.645760 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5966d87587-xx7pt"] Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.661593 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9c88b76f-d7mrk"] 
Dec 04 00:00:45 crc kubenswrapper[4764]: I1204 00:00:45.669836 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f9c88b76f-d7mrk"] Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.564382 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerID="cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847" exitCode=0 Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.565429 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948b0b90-4d1b-4cfc-b649-720bd7a5b69e" path="/var/lib/kubelet/pods/948b0b90-4d1b-4cfc-b649-720bd7a5b69e/volumes" Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.565900 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f110ed53-cae7-4a1c-b62f-5cbf21158737" path="/var/lib/kubelet/pods/f110ed53-cae7-4a1c-b62f-5cbf21158737/volumes" Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.566340 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qwhzb" event={"ID":"6c4fa015-bd9b-44c9-a09b-41630154ec52","Type":"ContainerStarted","Data":"479e4e6aa54d02d281f466e2c912ec3552c93b6120c0b7b2a56e584d81dec421"} Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.566374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" event={"ID":"ff9327f4-311d-47f7-a6c0-2daf84054201","Type":"ContainerDied","Data":"cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847"} Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.615262 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-qwhzb" podStartSLOduration=2.9621691180000003 podStartE2EDuration="31.615236726s" podCreationTimestamp="2025-12-04 00:00:15 +0000 UTC" firstStartedPulling="2025-12-04 00:00:16.450107584 +0000 UTC m=+1152.211431995" lastFinishedPulling="2025-12-04 00:00:45.103175192 +0000 UTC m=+1180.864499603" 
observedRunningTime="2025-12-04 00:00:46.580941614 +0000 UTC m=+1182.342266045" watchObservedRunningTime="2025-12-04 00:00:46.615236726 +0000 UTC m=+1182.376561127" Dec 04 00:00:46 crc kubenswrapper[4764]: I1204 00:00:46.649802 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:00:47 crc kubenswrapper[4764]: I1204 00:00:47.577811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" event={"ID":"ff9327f4-311d-47f7-a6c0-2daf84054201","Type":"ContainerStarted","Data":"7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490"} Dec 04 00:00:47 crc kubenswrapper[4764]: I1204 00:00:47.578577 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:50 crc kubenswrapper[4764]: I1204 00:00:50.627342 4764 generic.go:334] "Generic (PLEG): container finished" podID="c92ef6c0-37d1-4296-b63c-cc63e79081d6" containerID="f956e9fd6be8e36f08c3d2e54ef6651af5346794be3c9e0b2c337f6520a43b6f" exitCode=0 Dec 04 00:00:50 crc kubenswrapper[4764]: I1204 00:00:50.627422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl8sb" event={"ID":"c92ef6c0-37d1-4296-b63c-cc63e79081d6","Type":"ContainerDied","Data":"f956e9fd6be8e36f08c3d2e54ef6651af5346794be3c9e0b2c337f6520a43b6f"} Dec 04 00:00:50 crc kubenswrapper[4764]: I1204 00:00:50.657405 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" podStartSLOduration=6.657383974 podStartE2EDuration="6.657383974s" podCreationTimestamp="2025-12-04 00:00:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:00:47.59885954 +0000 UTC m=+1183.360183951" watchObservedRunningTime="2025-12-04 00:00:50.657383974 +0000 UTC m=+1186.418708385" Dec 04 00:00:54 crc kubenswrapper[4764]: I1204 
00:00:54.817015 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:00:54 crc kubenswrapper[4764]: I1204 00:00:54.894564 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-dnn2v"] Dec 04 00:00:54 crc kubenswrapper[4764]: I1204 00:00:54.895026 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="dnsmasq-dns" containerID="cri-o://46106a5cc93080656e8561cd4ed244f717d4e08eb5d402bb091fea383467d78e" gracePeriod=10 Dec 04 00:00:55 crc kubenswrapper[4764]: I1204 00:00:55.675514 4764 generic.go:334] "Generic (PLEG): container finished" podID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerID="46106a5cc93080656e8561cd4ed244f717d4e08eb5d402bb091fea383467d78e" exitCode=0 Dec 04 00:00:55 crc kubenswrapper[4764]: I1204 00:00:55.675569 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" event={"ID":"5ec747d4-3aa0-4b3c-bb84-13776b506793","Type":"ContainerDied","Data":"46106a5cc93080656e8561cd4ed244f717d4e08eb5d402bb091fea383467d78e"} Dec 04 00:00:55 crc kubenswrapper[4764]: I1204 00:00:55.991490 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.081452 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khl6j\" (UniqueName: \"kubernetes.io/projected/c92ef6c0-37d1-4296-b63c-cc63e79081d6-kube-api-access-khl6j\") pod \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.081825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-credential-keys\") pod \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.081928 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-scripts\") pod \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.081968 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-combined-ca-bundle\") pod \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.081994 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-config-data\") pod \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.082018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-fernet-keys\") pod \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\" (UID: \"c92ef6c0-37d1-4296-b63c-cc63e79081d6\") " Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.087342 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c92ef6c0-37d1-4296-b63c-cc63e79081d6" (UID: "c92ef6c0-37d1-4296-b63c-cc63e79081d6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.088818 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c92ef6c0-37d1-4296-b63c-cc63e79081d6-kube-api-access-khl6j" (OuterVolumeSpecName: "kube-api-access-khl6j") pod "c92ef6c0-37d1-4296-b63c-cc63e79081d6" (UID: "c92ef6c0-37d1-4296-b63c-cc63e79081d6"). InnerVolumeSpecName "kube-api-access-khl6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.088927 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c92ef6c0-37d1-4296-b63c-cc63e79081d6" (UID: "c92ef6c0-37d1-4296-b63c-cc63e79081d6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.092018 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-scripts" (OuterVolumeSpecName: "scripts") pod "c92ef6c0-37d1-4296-b63c-cc63e79081d6" (UID: "c92ef6c0-37d1-4296-b63c-cc63e79081d6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.106728 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c92ef6c0-37d1-4296-b63c-cc63e79081d6" (UID: "c92ef6c0-37d1-4296-b63c-cc63e79081d6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.108490 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-config-data" (OuterVolumeSpecName: "config-data") pod "c92ef6c0-37d1-4296-b63c-cc63e79081d6" (UID: "c92ef6c0-37d1-4296-b63c-cc63e79081d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.183677 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.183709 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.183773 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khl6j\" (UniqueName: \"kubernetes.io/projected/c92ef6c0-37d1-4296-b63c-cc63e79081d6-kube-api-access-khl6j\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.183788 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 
04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.183798 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.183808 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c92ef6c0-37d1-4296-b63c-cc63e79081d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.685547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fl8sb" event={"ID":"c92ef6c0-37d1-4296-b63c-cc63e79081d6","Type":"ContainerDied","Data":"7a3cb009e568a5723441689b260ab1ce2bc44d56a684eef3ac6419c716ab9144"} Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.685590 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a3cb009e568a5723441689b260ab1ce2bc44d56a684eef3ac6419c716ab9144" Dec 04 00:00:56 crc kubenswrapper[4764]: I1204 00:00:56.685594 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fl8sb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.074207 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fl8sb"] Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.081918 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fl8sb"] Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.164749 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kh5nb"] Dec 04 00:00:57 crc kubenswrapper[4764]: E1204 00:00:57.165235 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c92ef6c0-37d1-4296-b63c-cc63e79081d6" containerName="keystone-bootstrap" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.165257 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c92ef6c0-37d1-4296-b63c-cc63e79081d6" containerName="keystone-bootstrap" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.165511 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c92ef6c0-37d1-4296-b63c-cc63e79081d6" containerName="keystone-bootstrap" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.166228 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.168204 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.170014 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6jnw5" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.173958 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kh5nb"] Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.175110 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.175413 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.177260 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.303913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-credential-keys\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.304332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-config-data\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.304367 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x9q28\" (UniqueName: \"kubernetes.io/projected/2aa3596a-fac7-4d92-93fc-4d609fb54513-kube-api-access-x9q28\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.304402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-fernet-keys\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.304430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-scripts\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.304507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-combined-ca-bundle\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.406311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-credential-keys\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.406390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-config-data\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.406423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9q28\" (UniqueName: \"kubernetes.io/projected/2aa3596a-fac7-4d92-93fc-4d609fb54513-kube-api-access-x9q28\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.406450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-fernet-keys\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.406465 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-scripts\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.406500 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-combined-ca-bundle\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.410176 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-credential-keys\") pod \"keystone-bootstrap-kh5nb\" 
(UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.410429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-combined-ca-bundle\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.411310 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-scripts\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.413702 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-config-data\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.417647 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-fernet-keys\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 00:00:57.426004 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9q28\" (UniqueName: \"kubernetes.io/projected/2aa3596a-fac7-4d92-93fc-4d609fb54513-kube-api-access-x9q28\") pod \"keystone-bootstrap-kh5nb\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:57 crc kubenswrapper[4764]: I1204 
00:00:57.494530 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:00:58 crc kubenswrapper[4764]: I1204 00:00:58.556492 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c92ef6c0-37d1-4296-b63c-cc63e79081d6" path="/var/lib/kubelet/pods/c92ef6c0-37d1-4296-b63c-cc63e79081d6/volumes" Dec 04 00:00:59 crc kubenswrapper[4764]: I1204 00:00:59.725799 4764 generic.go:334] "Generic (PLEG): container finished" podID="6c4fa015-bd9b-44c9-a09b-41630154ec52" containerID="479e4e6aa54d02d281f466e2c912ec3552c93b6120c0b7b2a56e584d81dec421" exitCode=0 Dec 04 00:00:59 crc kubenswrapper[4764]: I1204 00:00:59.726884 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qwhzb" event={"ID":"6c4fa015-bd9b-44c9-a09b-41630154ec52","Type":"ContainerDied","Data":"479e4e6aa54d02d281f466e2c912ec3552c93b6120c0b7b2a56e584d81dec421"} Dec 04 00:01:01 crc kubenswrapper[4764]: I1204 00:01:01.008781 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.086580 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.097618 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qwhzb" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.209758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-sb\") pod \"5ec747d4-3aa0-4b3c-bb84-13776b506793\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.210201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-nb\") pod \"5ec747d4-3aa0-4b3c-bb84-13776b506793\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.210375 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g85vq\" (UniqueName: \"kubernetes.io/projected/6c4fa015-bd9b-44c9-a09b-41630154ec52-kube-api-access-g85vq\") pod \"6c4fa015-bd9b-44c9-a09b-41630154ec52\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.210490 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-db-sync-config-data\") pod \"6c4fa015-bd9b-44c9-a09b-41630154ec52\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.210585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llbkq\" (UniqueName: \"kubernetes.io/projected/5ec747d4-3aa0-4b3c-bb84-13776b506793-kube-api-access-llbkq\") pod \"5ec747d4-3aa0-4b3c-bb84-13776b506793\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.210701 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-dns-svc\") pod \"5ec747d4-3aa0-4b3c-bb84-13776b506793\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.210896 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-combined-ca-bundle\") pod \"6c4fa015-bd9b-44c9-a09b-41630154ec52\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.211007 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-config-data\") pod \"6c4fa015-bd9b-44c9-a09b-41630154ec52\" (UID: \"6c4fa015-bd9b-44c9-a09b-41630154ec52\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.211120 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-config\") pod \"5ec747d4-3aa0-4b3c-bb84-13776b506793\" (UID: \"5ec747d4-3aa0-4b3c-bb84-13776b506793\") " Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.216327 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c4fa015-bd9b-44c9-a09b-41630154ec52-kube-api-access-g85vq" (OuterVolumeSpecName: "kube-api-access-g85vq") pod "6c4fa015-bd9b-44c9-a09b-41630154ec52" (UID: "6c4fa015-bd9b-44c9-a09b-41630154ec52"). InnerVolumeSpecName "kube-api-access-g85vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.217160 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6c4fa015-bd9b-44c9-a09b-41630154ec52" (UID: "6c4fa015-bd9b-44c9-a09b-41630154ec52"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.219165 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec747d4-3aa0-4b3c-bb84-13776b506793-kube-api-access-llbkq" (OuterVolumeSpecName: "kube-api-access-llbkq") pod "5ec747d4-3aa0-4b3c-bb84-13776b506793" (UID: "5ec747d4-3aa0-4b3c-bb84-13776b506793"). InnerVolumeSpecName "kube-api-access-llbkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.246689 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c4fa015-bd9b-44c9-a09b-41630154ec52" (UID: "6c4fa015-bd9b-44c9-a09b-41630154ec52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.264973 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5ec747d4-3aa0-4b3c-bb84-13776b506793" (UID: "5ec747d4-3aa0-4b3c-bb84-13776b506793"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.265396 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5ec747d4-3aa0-4b3c-bb84-13776b506793" (UID: "5ec747d4-3aa0-4b3c-bb84-13776b506793"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.269212 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5ec747d4-3aa0-4b3c-bb84-13776b506793" (UID: "5ec747d4-3aa0-4b3c-bb84-13776b506793"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.270106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-config" (OuterVolumeSpecName: "config") pod "5ec747d4-3aa0-4b3c-bb84-13776b506793" (UID: "5ec747d4-3aa0-4b3c-bb84-13776b506793"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.286879 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-config-data" (OuterVolumeSpecName: "config-data") pod "6c4fa015-bd9b-44c9-a09b-41630154ec52" (UID: "6c4fa015-bd9b-44c9-a09b-41630154ec52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313405 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313447 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313457 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313465 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313473 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313481 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g85vq\" (UniqueName: \"kubernetes.io/projected/6c4fa015-bd9b-44c9-a09b-41630154ec52-kube-api-access-g85vq\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313492 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6c4fa015-bd9b-44c9-a09b-41630154ec52-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313527 4764 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llbkq\" (UniqueName: \"kubernetes.io/projected/5ec747d4-3aa0-4b3c-bb84-13776b506793-kube-api-access-llbkq\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.313539 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5ec747d4-3aa0-4b3c-bb84-13776b506793-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.760464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" event={"ID":"5ec747d4-3aa0-4b3c-bb84-13776b506793","Type":"ContainerDied","Data":"54396dbbc47e4ef05fe03261495a4457af6905f89ebe035dfa467bb352ed7eee"} Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.760518 4764 scope.go:117] "RemoveContainer" containerID="46106a5cc93080656e8561cd4ed244f717d4e08eb5d402bb091fea383467d78e" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.760619 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.764049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-qwhzb" event={"ID":"6c4fa015-bd9b-44c9-a09b-41630154ec52","Type":"ContainerDied","Data":"306b8e9a426e9f0aaeed04cba47d545c0a09cbe81dea15f635c9955519f47509"} Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.764317 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306b8e9a426e9f0aaeed04cba47d545c0a09cbe81dea15f635c9955519f47509" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.764091 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-qwhzb" Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.867154 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-dnn2v"] Dec 04 00:01:03 crc kubenswrapper[4764]: I1204 00:01:03.874726 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-784d65c867-dnn2v"] Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.231084 4764 scope.go:117] "RemoveContainer" containerID="947902412f86c6c04759d68ebaf69307f99a482182aee5cf097d77911d8d56e7" Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.280621 4764 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.280831 4764 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhwcb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-rsw7c_openstack(4312db12-846e-4bc4-8f2f-7121ac50776d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.282012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-rsw7c" podUID="4312db12-846e-4bc4-8f2f-7121ac50776d" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.578880 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" path="/var/lib/kubelet/pods/5ec747d4-3aa0-4b3c-bb84-13776b506793/volumes" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.580184 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79464d554c-dqclt"] Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.580477 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="dnsmasq-dns" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.580495 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="dnsmasq-dns" Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.580512 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="init" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.580518 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="init" Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.580548 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c4fa015-bd9b-44c9-a09b-41630154ec52" containerName="glance-db-sync" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.580554 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c4fa015-bd9b-44c9-a09b-41630154ec52" containerName="glance-db-sync" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.580754 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c4fa015-bd9b-44c9-a09b-41630154ec52" containerName="glance-db-sync" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.580775 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="dnsmasq-dns" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.581532 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-dqclt"] Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.581622 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.741672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-svc\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.741795 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-nb\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.741814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-config\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.741828 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-sb\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.741881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-swift-storage-0\") pod \"dnsmasq-dns-79464d554c-dqclt\" 
(UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.741908 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p4x9\" (UniqueName: \"kubernetes.io/projected/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-kube-api-access-2p4x9\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.780280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-js842" event={"ID":"1ee4e1b8-616d-469c-988d-f371d65275d9","Type":"ContainerStarted","Data":"ab31569a4bf1bd2101806486db1ae7ca52640ffa88e43ec4ead8858ebfefecfb"} Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.790418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dlttd" event={"ID":"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31","Type":"ContainerStarted","Data":"5762b0af951828ae8d0afb335be9c60e1432402c4b4fc601d8d05ef67a4d97cf"} Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.794329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerStarted","Data":"76298f4f61a4e278f615ba467c8bd55908c27161f4d0b447d1e0048cf514c457"} Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.799048 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-js842" podStartSLOduration=3.191288344 podStartE2EDuration="20.799028172s" podCreationTimestamp="2025-12-04 00:00:44 +0000 UTC" firstStartedPulling="2025-12-04 00:00:45.342481881 +0000 UTC m=+1181.103806292" lastFinishedPulling="2025-12-04 00:01:02.950221669 +0000 UTC m=+1198.711546120" observedRunningTime="2025-12-04 00:01:04.795566928 +0000 UTC m=+1200.556891339" 
watchObservedRunningTime="2025-12-04 00:01:04.799028172 +0000 UTC m=+1200.560352583" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.801281 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" containerID="3ea7362b2e46685a8e6f0b8977d5d3a7b0a135c14758f5677b71e9e97057f3b0" exitCode=0 Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.801531 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7hd69" event={"ID":"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded","Type":"ContainerDied","Data":"3ea7362b2e46685a8e6f0b8977d5d3a7b0a135c14758f5677b71e9e97057f3b0"} Dec 04 00:01:04 crc kubenswrapper[4764]: E1204 00:01:04.812655 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-rsw7c" podUID="4312db12-846e-4bc4-8f2f-7121ac50776d" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.818430 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dlttd" podStartSLOduration=1.622470559 podStartE2EDuration="20.818411008s" podCreationTimestamp="2025-12-04 00:00:44 +0000 UTC" firstStartedPulling="2025-12-04 00:00:45.04681948 +0000 UTC m=+1180.808143891" lastFinishedPulling="2025-12-04 00:01:04.242759919 +0000 UTC m=+1200.004084340" observedRunningTime="2025-12-04 00:01:04.812819941 +0000 UTC m=+1200.574144362" watchObservedRunningTime="2025-12-04 00:01:04.818411008 +0000 UTC m=+1200.579735409" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.856893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-svc\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: 
\"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.857111 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-nb\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.857150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-config\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.857184 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-sb\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.857485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-swift-storage-0\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.857569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p4x9\" (UniqueName: \"kubernetes.io/projected/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-kube-api-access-2p4x9\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " 
pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.860108 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-config\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.883260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-svc\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.883319 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kh5nb"] Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.884499 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-nb\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.884536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-sb\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.888131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-swift-storage-0\") pod \"dnsmasq-dns-79464d554c-dqclt\" 
(UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.898073 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 00:01:04 crc kubenswrapper[4764]: I1204 00:01:04.907090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p4x9\" (UniqueName: \"kubernetes.io/projected/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-kube-api-access-2p4x9\") pod \"dnsmasq-dns-79464d554c-dqclt\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.201827 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.463949 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.466504 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.472977 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.473190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xwh7d" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.473338 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.475840 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.607933 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.608061 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrfn\" (UniqueName: \"kubernetes.io/projected/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-kube-api-access-wqrfn\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.608100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc 
kubenswrapper[4764]: I1204 00:01:05.608125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.608145 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-logs\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.608163 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.608194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.647033 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-dqclt"] Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.685079 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.686985 4764 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.693086 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.715344 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.716377 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrfn\" (UniqueName: \"kubernetes.io/projected/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-kube-api-access-wqrfn\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.716464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.716493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.716536 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-logs\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.716568 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.716643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.718563 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.718587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-logs\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.718899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.718941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.723865 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-scripts\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.727414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.756081 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-config-data\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.758380 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrfn\" (UniqueName: \"kubernetes.io/projected/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-kube-api-access-wqrfn\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.771437 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.811922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-dqclt" event={"ID":"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5","Type":"ContainerStarted","Data":"222e5467efd4669199b176c094e41fec5361ed231fadf8ba27eabf73d0a2863f"} Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.816765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5nb" event={"ID":"2aa3596a-fac7-4d92-93fc-4d609fb54513","Type":"ContainerStarted","Data":"f1c48e30fd43fed9382b6ed8ffc8d238c4969b51e7d651411bac1bc7845c92ad"} Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.816804 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5nb" event={"ID":"2aa3596a-fac7-4d92-93fc-4d609fb54513","Type":"ContainerStarted","Data":"027a108d4e99c2936b0f379c82fb2b258cf804e9167a3bc6a49cccd86b456cb9"} Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.819535 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.819602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 
00:01:05.819639 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.819680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.819753 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-logs\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.819823 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4qpr\" (UniqueName: \"kubernetes.io/projected/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-kube-api-access-r4qpr\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.819860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 
00:01:05.829415 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.839877 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kh5nb" podStartSLOduration=8.83985668 podStartE2EDuration="8.83985668s" podCreationTimestamp="2025-12-04 00:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:05.83494395 +0000 UTC m=+1201.596268381" watchObservedRunningTime="2025-12-04 00:01:05.83985668 +0000 UTC m=+1201.601181091" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923480 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-logs\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4qpr\" (UniqueName: \"kubernetes.io/projected/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-kube-api-access-r4qpr\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923735 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.923929 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.925686 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.926155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-logs\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.926336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.934570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.935339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.936552 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.939408 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4qpr\" (UniqueName: \"kubernetes.io/projected/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-kube-api-access-r4qpr\") pod 
\"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:05 crc kubenswrapper[4764]: I1204 00:01:05.964483 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.009334 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-784d65c867-dnn2v" podUID="5ec747d4-3aa0-4b3c-bb84-13776b506793" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.012243 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.404036 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7hd69" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.534117 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-combined-ca-bundle\") pod \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.534368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-config\") pod \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.534564 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm7mq\" (UniqueName: \"kubernetes.io/projected/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-kube-api-access-fm7mq\") pod \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\" (UID: \"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded\") " Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.544600 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-kube-api-access-fm7mq" (OuterVolumeSpecName: "kube-api-access-fm7mq") pod "5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" (UID: "5f46018a-c0f3-4feb-9a18-d3a8e80d3ded"). InnerVolumeSpecName "kube-api-access-fm7mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.582441 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-config" (OuterVolumeSpecName: "config") pod "5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" (UID: "5f46018a-c0f3-4feb-9a18-d3a8e80d3ded"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.582620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" (UID: "5f46018a-c0f3-4feb-9a18-d3a8e80d3ded"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.642162 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm7mq\" (UniqueName: \"kubernetes.io/projected/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-kube-api-access-fm7mq\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.642192 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.642208 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.835671 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7hd69" event={"ID":"5f46018a-c0f3-4feb-9a18-d3a8e80d3ded","Type":"ContainerDied","Data":"c730a1eb0335a0fd6b6fb6d1a2700a6b3c1fb3ae281cca4351c21424269782e6"} Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.835743 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c730a1eb0335a0fd6b6fb6d1a2700a6b3c1fb3ae281cca4351c21424269782e6" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.835828 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7hd69" Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.849931 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerID="14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a" exitCode=0 Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.850010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-dqclt" event={"ID":"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5","Type":"ContainerDied","Data":"14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a"} Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.854903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerStarted","Data":"969dd782fb7aff3b8ba47ddd55bfe4dfe30112f18b5f24a523613937445114b3"} Dec 04 00:01:06 crc kubenswrapper[4764]: I1204 00:01:06.972077 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.105231 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.116530 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-dqclt"] Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.145685 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-xqw9z"] Dec 04 00:01:07 crc kubenswrapper[4764]: E1204 00:01:07.146151 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" containerName="neutron-db-sync" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.146177 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" containerName="neutron-db-sync" Dec 04 00:01:07 
crc kubenswrapper[4764]: I1204 00:01:07.146405 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" containerName="neutron-db-sync" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.148817 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.179705 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-xqw9z"] Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.246737 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7767dcd5bd-r5prb"] Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.248409 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.256639 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.256938 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.257084 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7lj2c" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.257276 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.262124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-sb\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 
00:01:07.271316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmttc\" (UniqueName: \"kubernetes.io/projected/bc53cddb-2106-4e89-836d-11ba8b24ef2c-kube-api-access-cmttc\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.271441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-nb\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.263623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7767dcd5bd-r5prb"] Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.271650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-swift-storage-0\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.271808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-svc\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.272160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-config\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-config\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-ovndb-tls-certs\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-sb\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmttc\" (UniqueName: \"kubernetes.io/projected/bc53cddb-2106-4e89-836d-11ba8b24ef2c-kube-api-access-cmttc\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-nb\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374267 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-httpd-config\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374282 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-swift-storage-0\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-svc\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374337 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-combined-ca-bundle\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pc9w\" (UniqueName: 
\"kubernetes.io/projected/71db3b5f-8617-41ff-b0d2-5734f1941648-kube-api-access-7pc9w\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.374375 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-config\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.375184 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-config\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.375401 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-nb\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.375728 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-swift-storage-0\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.381871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-svc\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: 
\"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.382008 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-sb\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.396433 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmttc\" (UniqueName: \"kubernetes.io/projected/bc53cddb-2106-4e89-836d-11ba8b24ef2c-kube-api-access-cmttc\") pod \"dnsmasq-dns-67dfc45497-xqw9z\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.476093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-httpd-config\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.476183 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-combined-ca-bundle\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.476213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pc9w\" (UniqueName: \"kubernetes.io/projected/71db3b5f-8617-41ff-b0d2-5734f1941648-kube-api-access-7pc9w\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " 
pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.476292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-config\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.476329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-ovndb-tls-certs\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.483366 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-ovndb-tls-certs\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.483938 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-combined-ca-bundle\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.484135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-httpd-config\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.486437 4764 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.487235 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-config\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.501518 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pc9w\" (UniqueName: \"kubernetes.io/projected/71db3b5f-8617-41ff-b0d2-5734f1941648-kube-api-access-7pc9w\") pod \"neutron-7767dcd5bd-r5prb\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.649265 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.913232 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ee4e1b8-616d-469c-988d-f371d65275d9" containerID="ab31569a4bf1bd2101806486db1ae7ca52640ffa88e43ec4ead8858ebfefecfb" exitCode=0 Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.913422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-js842" event={"ID":"1ee4e1b8-616d-469c-988d-f371d65275d9","Type":"ContainerDied","Data":"ab31569a4bf1bd2101806486db1ae7ca52640ffa88e43ec4ead8858ebfefecfb"} Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.924616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f","Type":"ContainerStarted","Data":"6a7be4004b4e5c32864cace6ad35f674c278ac619e40c4d21e2ed4be3053d51c"} Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.924664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f","Type":"ContainerStarted","Data":"626cda7b43794806bea3f436dffcd49765077b549b9afa66ca1a2eea7d613074"} Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.927029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-dqclt" event={"ID":"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5","Type":"ContainerStarted","Data":"4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf"} Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.927191 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79464d554c-dqclt" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerName="dnsmasq-dns" containerID="cri-o://4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf" gracePeriod=10 Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.927471 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.948943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5d8f8f0-4896-4b78-8bac-dffe021bdb58","Type":"ContainerStarted","Data":"e503beff1a35efb9d581e3045466f2b71e6a0a699904916dbe172d5b517f4be4"} Dec 04 00:01:07 crc kubenswrapper[4764]: I1204 00:01:07.963142 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79464d554c-dqclt" podStartSLOduration=3.963120735 podStartE2EDuration="3.963120735s" podCreationTimestamp="2025-12-04 00:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:07.955290063 +0000 UTC m=+1203.716614494" watchObservedRunningTime="2025-12-04 00:01:07.963120735 +0000 UTC m=+1203.724445166" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 
00:01:08.021485 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-xqw9z"] Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.310012 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7767dcd5bd-r5prb"] Dec 04 00:01:08 crc kubenswrapper[4764]: W1204 00:01:08.329657 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71db3b5f_8617_41ff_b0d2_5734f1941648.slice/crio-1d5091a1324a1252c43f5dfb80e875fa8e87b99798954ca6cfe2cacd0392d5a5 WatchSource:0}: Error finding container 1d5091a1324a1252c43f5dfb80e875fa8e87b99798954ca6cfe2cacd0392d5a5: Status 404 returned error can't find the container with id 1d5091a1324a1252c43f5dfb80e875fa8e87b99798954ca6cfe2cacd0392d5a5 Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.601122 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.707016 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-config\") pod \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.707104 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p4x9\" (UniqueName: \"kubernetes.io/projected/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-kube-api-access-2p4x9\") pod \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.707155 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-sb\") pod 
\"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.707216 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-swift-storage-0\") pod \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.707284 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-nb\") pod \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.707309 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-svc\") pod \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\" (UID: \"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5\") " Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.737528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-kube-api-access-2p4x9" (OuterVolumeSpecName: "kube-api-access-2p4x9") pod "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" (UID: "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5"). InnerVolumeSpecName "kube-api-access-2p4x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.811064 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p4x9\" (UniqueName: \"kubernetes.io/projected/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-kube-api-access-2p4x9\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.828141 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" (UID: "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.829171 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" (UID: "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.842598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-config" (OuterVolumeSpecName: "config") pod "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" (UID: "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.842685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" (UID: "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.857382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" (UID: "ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.912800 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.925853 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.925870 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.925881 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.925891 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.956317 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.972267 4764 generic.go:334] "Generic (PLEG): container finished" podID="959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" containerID="5762b0af951828ae8d0afb335be9c60e1432402c4b4fc601d8d05ef67a4d97cf" exitCode=0 Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.972587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dlttd" event={"ID":"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31","Type":"ContainerDied","Data":"5762b0af951828ae8d0afb335be9c60e1432402c4b4fc601d8d05ef67a4d97cf"} Dec 04 00:01:08 crc kubenswrapper[4764]: I1204 00:01:08.991644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f","Type":"ContainerStarted","Data":"887170f7153697a79795b33f38cbecf3097718338deedcb4765664cf8bf16c21"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.020015 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerID="4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf" exitCode=0 Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.020063 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79464d554c-dqclt" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.020084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-dqclt" event={"ID":"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5","Type":"ContainerDied","Data":"4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.020892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79464d554c-dqclt" event={"ID":"ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5","Type":"ContainerDied","Data":"222e5467efd4669199b176c094e41fec5361ed231fadf8ba27eabf73d0a2863f"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.020957 4764 scope.go:117] "RemoveContainer" containerID="4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.026328 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.041062 4764 generic.go:334] "Generic (PLEG): container finished" podID="2aa3596a-fac7-4d92-93fc-4d609fb54513" containerID="f1c48e30fd43fed9382b6ed8ffc8d238c4969b51e7d651411bac1bc7845c92ad" exitCode=0 Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.041138 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5nb" event={"ID":"2aa3596a-fac7-4d92-93fc-4d609fb54513","Type":"ContainerDied","Data":"f1c48e30fd43fed9382b6ed8ffc8d238c4969b51e7d651411bac1bc7845c92ad"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.044523 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.044508478 podStartE2EDuration="5.044508478s" podCreationTimestamp="2025-12-04 00:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-04 00:01:09.032894543 +0000 UTC m=+1204.794218954" watchObservedRunningTime="2025-12-04 00:01:09.044508478 +0000 UTC m=+1204.805832889" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.046577 4764 generic.go:334] "Generic (PLEG): container finished" podID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerID="7a55007fa879780e3161e3c603cb9940175ebdd94d857acec99dfea2a124f7bc" exitCode=0 Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.046633 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" event={"ID":"bc53cddb-2106-4e89-836d-11ba8b24ef2c","Type":"ContainerDied","Data":"7a55007fa879780e3161e3c603cb9940175ebdd94d857acec99dfea2a124f7bc"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.046658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" event={"ID":"bc53cddb-2106-4e89-836d-11ba8b24ef2c","Type":"ContainerStarted","Data":"710700ad7f928a79c6d0bebf11b3e9bef7eeda113c2227b4e9b19651cc65a9e8"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.054939 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5d8f8f0-4896-4b78-8bac-dffe021bdb58","Type":"ContainerStarted","Data":"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.055372 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-log" containerID="cri-o://5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13" gracePeriod=30 Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.055583 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-httpd" 
containerID="cri-o://587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2" gracePeriod=30 Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.083289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7767dcd5bd-r5prb" event={"ID":"71db3b5f-8617-41ff-b0d2-5734f1941648","Type":"ContainerStarted","Data":"0db85c34c4501b51f6ef73acd76ff0d05d82561b978a0f0f6c72255b04fb1889"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.083336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7767dcd5bd-r5prb" event={"ID":"71db3b5f-8617-41ff-b0d2-5734f1941648","Type":"ContainerStarted","Data":"1d5091a1324a1252c43f5dfb80e875fa8e87b99798954ca6cfe2cacd0392d5a5"} Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.083585 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.085844 4764 scope.go:117] "RemoveContainer" containerID="14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.089376 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.089361438 podStartE2EDuration="5.089361438s" podCreationTimestamp="2025-12-04 00:01:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:09.086874887 +0000 UTC m=+1204.848199298" watchObservedRunningTime="2025-12-04 00:01:09.089361438 +0000 UTC m=+1204.850685849" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.143119 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-dqclt"] Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.159759 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79464d554c-dqclt"] Dec 04 00:01:09 crc 
kubenswrapper[4764]: I1204 00:01:09.172155 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7767dcd5bd-r5prb" podStartSLOduration=2.172137458 podStartE2EDuration="2.172137458s" podCreationTimestamp="2025-12-04 00:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:09.154507576 +0000 UTC m=+1204.915831987" watchObservedRunningTime="2025-12-04 00:01:09.172137458 +0000 UTC m=+1204.933461869" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.173876 4764 scope.go:117] "RemoveContainer" containerID="4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf" Dec 04 00:01:09 crc kubenswrapper[4764]: E1204 00:01:09.174223 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf\": container with ID starting with 4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf not found: ID does not exist" containerID="4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.174251 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf"} err="failed to get container status \"4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf\": rpc error: code = NotFound desc = could not find container \"4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf\": container with ID starting with 4afe7bcbac9b6400c7e0c3c95dfc296b2fc9be9dfd592eafe3358d56becbb0bf not found: ID does not exist" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.174273 4764 scope.go:117] "RemoveContainer" containerID="14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a" Dec 04 00:01:09 crc 
kubenswrapper[4764]: E1204 00:01:09.174773 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a\": container with ID starting with 14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a not found: ID does not exist" containerID="14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.174802 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a"} err="failed to get container status \"14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a\": rpc error: code = NotFound desc = could not find container \"14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a\": container with ID starting with 14e1ba703ea6162b434cae0e8192c02360c172105c9f38bbfbd0feec3c7f0c6a not found: ID does not exist" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.549314 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-js842" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.558636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-combined-ca-bundle\") pod \"1ee4e1b8-616d-469c-988d-f371d65275d9\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.558774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-scripts\") pod \"1ee4e1b8-616d-469c-988d-f371d65275d9\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.559429 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e1b8-616d-469c-988d-f371d65275d9-logs\") pod \"1ee4e1b8-616d-469c-988d-f371d65275d9\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.559482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-config-data\") pod \"1ee4e1b8-616d-469c-988d-f371d65275d9\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.559566 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb7jn\" (UniqueName: \"kubernetes.io/projected/1ee4e1b8-616d-469c-988d-f371d65275d9-kube-api-access-sb7jn\") pod \"1ee4e1b8-616d-469c-988d-f371d65275d9\" (UID: \"1ee4e1b8-616d-469c-988d-f371d65275d9\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.560886 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1ee4e1b8-616d-469c-988d-f371d65275d9-logs" (OuterVolumeSpecName: "logs") pod "1ee4e1b8-616d-469c-988d-f371d65275d9" (UID: "1ee4e1b8-616d-469c-988d-f371d65275d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.565886 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee4e1b8-616d-469c-988d-f371d65275d9-kube-api-access-sb7jn" (OuterVolumeSpecName: "kube-api-access-sb7jn") pod "1ee4e1b8-616d-469c-988d-f371d65275d9" (UID: "1ee4e1b8-616d-469c-988d-f371d65275d9"). InnerVolumeSpecName "kube-api-access-sb7jn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.566804 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-scripts" (OuterVolumeSpecName: "scripts") pod "1ee4e1b8-616d-469c-988d-f371d65275d9" (UID: "1ee4e1b8-616d-469c-988d-f371d65275d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.604050 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ee4e1b8-616d-469c-988d-f371d65275d9" (UID: "1ee4e1b8-616d-469c-988d-f371d65275d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.644923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-config-data" (OuterVolumeSpecName: "config-data") pod "1ee4e1b8-616d-469c-988d-f371d65275d9" (UID: "1ee4e1b8-616d-469c-988d-f371d65275d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.651966 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662158 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-httpd-run\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662273 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-scripts\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662322 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4qpr\" (UniqueName: \"kubernetes.io/projected/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-kube-api-access-r4qpr\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662353 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-combined-ca-bundle\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc 
kubenswrapper[4764]: I1204 00:01:09.662410 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-config-data\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662449 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-logs\") pod \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\" (UID: \"c5d8f8f0-4896-4b78-8bac-dffe021bdb58\") " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662850 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662862 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ee4e1b8-616d-469c-988d-f371d65275d9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662870 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662879 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb7jn\" (UniqueName: \"kubernetes.io/projected/1ee4e1b8-616d-469c-988d-f371d65275d9-kube-api-access-sb7jn\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.662887 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ee4e1b8-616d-469c-988d-f371d65275d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc 
kubenswrapper[4764]: I1204 00:01:09.663143 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-logs" (OuterVolumeSpecName: "logs") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.664122 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.673568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-kube-api-access-r4qpr" (OuterVolumeSpecName: "kube-api-access-r4qpr") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "kube-api-access-r4qpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.692864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.696295 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-scripts" (OuterVolumeSpecName: "scripts") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.709281 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.718872 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-config-data" (OuterVolumeSpecName: "config-data") pod "c5d8f8f0-4896-4b78-8bac-dffe021bdb58" (UID: "c5d8f8f0-4896-4b78-8bac-dffe021bdb58"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.763907 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4qpr\" (UniqueName: \"kubernetes.io/projected/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-kube-api-access-r4qpr\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.763942 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.763951 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.763959 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.763967 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.763998 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.764007 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5d8f8f0-4896-4b78-8bac-dffe021bdb58-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.792476 4764 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 00:01:09 crc kubenswrapper[4764]: I1204 00:01:09.865368 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096141 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6d45ff9d86-725zf"] Dec 04 00:01:10 crc kubenswrapper[4764]: E1204 00:01:10.096508 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerName="dnsmasq-dns" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerName="dnsmasq-dns" Dec 04 00:01:10 crc kubenswrapper[4764]: E1204 00:01:10.096546 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-httpd" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096553 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-httpd" Dec 04 00:01:10 crc kubenswrapper[4764]: E1204 00:01:10.096561 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee4e1b8-616d-469c-988d-f371d65275d9" containerName="placement-db-sync" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096568 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee4e1b8-616d-469c-988d-f371d65275d9" containerName="placement-db-sync" Dec 04 00:01:10 crc kubenswrapper[4764]: E1204 00:01:10.096582 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-log" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096590 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-log" Dec 04 00:01:10 crc kubenswrapper[4764]: E1204 00:01:10.096605 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerName="init" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096612 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerName="init" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096791 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee4e1b8-616d-469c-988d-f371d65275d9" containerName="placement-db-sync" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096813 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" containerName="dnsmasq-dns" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096829 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-httpd" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.096838 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerName="glance-log" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.097749 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.102056 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.105095 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.105400 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-js842" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.105420 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-js842" event={"ID":"1ee4e1b8-616d-469c-988d-f371d65275d9","Type":"ContainerDied","Data":"508f77551f1384d18dc5998659cb09d12043e85e77e21f543976951ac0729b99"} Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.105484 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="508f77551f1384d18dc5998659cb09d12043e85e77e21f543976951ac0729b99" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.110780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d45ff9d86-725zf"] Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.121909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" event={"ID":"bc53cddb-2106-4e89-836d-11ba8b24ef2c","Type":"ContainerStarted","Data":"6d8831f99fe7823d0e1d703eaa4ce6e9ea09330a3359ec9bd2762a39be4cca9e"} Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.122140 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126301 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerID="5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13" exitCode=143 Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126333 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" containerID="587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2" exitCode=143 Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126340 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126379 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5d8f8f0-4896-4b78-8bac-dffe021bdb58","Type":"ContainerDied","Data":"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13"} Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5d8f8f0-4896-4b78-8bac-dffe021bdb58","Type":"ContainerDied","Data":"e503beff1a35efb9d581e3045466f2b71e6a0a699904916dbe172d5b517f4be4"} Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c5d8f8f0-4896-4b78-8bac-dffe021bdb58","Type":"ContainerDied","Data":"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2"} Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.126455 4764 scope.go:117] "RemoveContainer" containerID="587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.139960 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-log" containerID="cri-o://6a7be4004b4e5c32864cace6ad35f674c278ac619e40c4d21e2ed4be3053d51c" gracePeriod=30 Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.140727 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7767dcd5bd-r5prb" event={"ID":"71db3b5f-8617-41ff-b0d2-5734f1941648","Type":"ContainerStarted","Data":"b1754d1518a91dca7ffe4af19acf1ff233c9d4a546ae1db616a051e6e998fa04"} Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.140853 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-httpd" containerID="cri-o://887170f7153697a79795b33f38cbecf3097718338deedcb4765664cf8bf16c21" gracePeriod=30 Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171468 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" podStartSLOduration=3.171442417 podStartE2EDuration="3.171442417s" podCreationTimestamp="2025-12-04 00:01:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:10.155739172 +0000 UTC m=+1205.917063583" watchObservedRunningTime="2025-12-04 00:01:10.171442417 +0000 UTC m=+1205.932766828" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-779cq\" (UniqueName: \"kubernetes.io/projected/3cad4f7f-7546-406c-822b-b6f77365d830-kube-api-access-779cq\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171683 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-public-tls-certs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-internal-tls-certs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " 
pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-scripts\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad4f7f-7546-406c-822b-b6f77365d830-logs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-combined-ca-bundle\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.171966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-config-data\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.208418 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.232645 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 
00:01:10.276080 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284388 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-779cq\" (UniqueName: \"kubernetes.io/projected/3cad4f7f-7546-406c-822b-b6f77365d830-kube-api-access-779cq\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284522 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-public-tls-certs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284662 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-internal-tls-certs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284752 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-scripts\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad4f7f-7546-406c-822b-b6f77365d830-logs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " 
pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284915 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-combined-ca-bundle\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-config-data\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.284981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.286370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad4f7f-7546-406c-822b-b6f77365d830-logs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.289079 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-scripts\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.289476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.289853 4764 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.309155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-internal-tls-certs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.312600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-combined-ca-bundle\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.313290 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-779cq\" (UniqueName: \"kubernetes.io/projected/3cad4f7f-7546-406c-822b-b6f77365d830-kube-api-access-779cq\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.313453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-config-data\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.313638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-public-tls-certs\") pod \"placement-6d45ff9d86-725zf\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc 
kubenswrapper[4764]: I1204 00:01:10.313703 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400444 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400641 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqch2\" (UniqueName: \"kubernetes.io/projected/02fa7d42-ad30-474b-98f6-ad1e423af7cc-kube-api-access-pqch2\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400798 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.400847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.420444 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqch2\" (UniqueName: \"kubernetes.io/projected/02fa7d42-ad30-474b-98f6-ad1e423af7cc-kube-api-access-pqch2\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502796 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502828 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.502913 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.503746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 
00:01:10.503991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-logs\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.510763 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.510965 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.520364 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqch2\" (UniqueName: \"kubernetes.io/projected/02fa7d42-ad30-474b-98f6-ad1e423af7cc-kube-api-access-pqch2\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.520821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.526207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.538947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.563763 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d8f8f0-4896-4b78-8bac-dffe021bdb58" path="/var/lib/kubelet/pods/c5d8f8f0-4896-4b78-8bac-dffe021bdb58/volumes" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.564402 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5" path="/var/lib/kubelet/pods/ff1b0aaf-ddcf-40c1-b84f-17753b39d2c5/volumes" Dec 04 00:01:10 crc kubenswrapper[4764]: I1204 00:01:10.761911 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.150141 4764 generic.go:334] "Generic (PLEG): container finished" podID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerID="887170f7153697a79795b33f38cbecf3097718338deedcb4765664cf8bf16c21" exitCode=0 Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.150178 4764 generic.go:334] "Generic (PLEG): container finished" podID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerID="6a7be4004b4e5c32864cace6ad35f674c278ac619e40c4d21e2ed4be3053d51c" exitCode=143 Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.150184 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f","Type":"ContainerDied","Data":"887170f7153697a79795b33f38cbecf3097718338deedcb4765664cf8bf16c21"} Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.150230 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f","Type":"ContainerDied","Data":"6a7be4004b4e5c32864cace6ad35f674c278ac619e40c4d21e2ed4be3053d51c"} Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.866060 4764 scope.go:117] "RemoveContainer" containerID="5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.871596 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c96d99869-mwjrh"] Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.893290 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c96d99869-mwjrh"] Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.893446 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.899388 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.899784 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-public-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-config\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942552 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-ovndb-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-combined-ca-bundle\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 
00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942629 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnz5c\" (UniqueName: \"kubernetes.io/projected/8499c909-53fe-4742-aa11-29e214451689-kube-api-access-cnz5c\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942663 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-internal-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.942681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-httpd-config\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.957420 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:01:11 crc kubenswrapper[4764]: I1204 00:01:11.979812 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dlttd" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043386 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-config-data\") pod \"2aa3596a-fac7-4d92-93fc-4d609fb54513\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9q28\" (UniqueName: \"kubernetes.io/projected/2aa3596a-fac7-4d92-93fc-4d609fb54513-kube-api-access-x9q28\") pod \"2aa3596a-fac7-4d92-93fc-4d609fb54513\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk8jc\" (UniqueName: \"kubernetes.io/projected/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-kube-api-access-lk8jc\") pod \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043576 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-scripts\") pod \"2aa3596a-fac7-4d92-93fc-4d609fb54513\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043603 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-combined-ca-bundle\") pod \"2aa3596a-fac7-4d92-93fc-4d609fb54513\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043620 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-fernet-keys\") pod \"2aa3596a-fac7-4d92-93fc-4d609fb54513\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-combined-ca-bundle\") pod \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.043660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-credential-keys\") pod \"2aa3596a-fac7-4d92-93fc-4d609fb54513\" (UID: \"2aa3596a-fac7-4d92-93fc-4d609fb54513\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-db-sync-config-data\") pod \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\" (UID: \"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31\") " Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-config\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-ovndb-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " 
pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044507 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-combined-ca-bundle\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnz5c\" (UniqueName: \"kubernetes.io/projected/8499c909-53fe-4742-aa11-29e214451689-kube-api-access-cnz5c\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-internal-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-httpd-config\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.044642 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-public-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc 
kubenswrapper[4764]: I1204 00:01:12.062946 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-internal-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.068153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-config\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.070628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnz5c\" (UniqueName: \"kubernetes.io/projected/8499c909-53fe-4742-aa11-29e214451689-kube-api-access-cnz5c\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.070875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-ovndb-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.072129 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-public-tls-certs\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.076841 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2aa3596a-fac7-4d92-93fc-4d609fb54513" (UID: "2aa3596a-fac7-4d92-93fc-4d609fb54513"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.079048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-kube-api-access-lk8jc" (OuterVolumeSpecName: "kube-api-access-lk8jc") pod "959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" (UID: "959bf4d2-5d71-4a02-a0a0-8c417c2a7d31"). InnerVolumeSpecName "kube-api-access-lk8jc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.087539 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-scripts" (OuterVolumeSpecName: "scripts") pod "2aa3596a-fac7-4d92-93fc-4d609fb54513" (UID: "2aa3596a-fac7-4d92-93fc-4d609fb54513"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.090085 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa3596a-fac7-4d92-93fc-4d609fb54513-kube-api-access-x9q28" (OuterVolumeSpecName: "kube-api-access-x9q28") pod "2aa3596a-fac7-4d92-93fc-4d609fb54513" (UID: "2aa3596a-fac7-4d92-93fc-4d609fb54513"). InnerVolumeSpecName "kube-api-access-x9q28". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.098222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2aa3596a-fac7-4d92-93fc-4d609fb54513" (UID: "2aa3596a-fac7-4d92-93fc-4d609fb54513"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.112405 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-httpd-config\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.112510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" (UID: "959bf4d2-5d71-4a02-a0a0-8c417c2a7d31"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.112809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-combined-ca-bundle\") pod \"neutron-5c96d99869-mwjrh\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.138786 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-config-data" (OuterVolumeSpecName: "config-data") pod "2aa3596a-fac7-4d92-93fc-4d609fb54513" (UID: "2aa3596a-fac7-4d92-93fc-4d609fb54513"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145798 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145869 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145880 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9q28\" (UniqueName: \"kubernetes.io/projected/2aa3596a-fac7-4d92-93fc-4d609fb54513-kube-api-access-x9q28\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145892 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk8jc\" (UniqueName: \"kubernetes.io/projected/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-kube-api-access-lk8jc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145900 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145908 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.145916 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.155168 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" (UID: "959bf4d2-5d71-4a02-a0a0-8c417c2a7d31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.164653 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2aa3596a-fac7-4d92-93fc-4d609fb54513" (UID: "2aa3596a-fac7-4d92-93fc-4d609fb54513"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.165397 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dlttd" event={"ID":"959bf4d2-5d71-4a02-a0a0-8c417c2a7d31","Type":"ContainerDied","Data":"e413a9bbbdd58e5f753f35c6c79b30f5fc9ea2dd85433c2d9e289d4c95873a8b"} Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.165520 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e413a9bbbdd58e5f753f35c6c79b30f5fc9ea2dd85433c2d9e289d4c95873a8b" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.165609 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-dlttd" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.168128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kh5nb" event={"ID":"2aa3596a-fac7-4d92-93fc-4d609fb54513","Type":"ContainerDied","Data":"027a108d4e99c2936b0f379c82fb2b258cf804e9167a3bc6a49cccd86b456cb9"} Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.168153 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="027a108d4e99c2936b0f379c82fb2b258cf804e9167a3bc6a49cccd86b456cb9" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.168208 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kh5nb" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.248867 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa3596a-fac7-4d92-93fc-4d609fb54513-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.248924 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:12 crc kubenswrapper[4764]: I1204 00:01:12.267563 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.060810 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-754f454454-nb48r"] Dec 04 00:01:13 crc kubenswrapper[4764]: E1204 00:01:13.061414 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aa3596a-fac7-4d92-93fc-4d609fb54513" containerName="keystone-bootstrap" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.061429 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa3596a-fac7-4d92-93fc-4d609fb54513" containerName="keystone-bootstrap" Dec 04 00:01:13 crc kubenswrapper[4764]: E1204 00:01:13.061443 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" containerName="barbican-db-sync" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.061449 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" containerName="barbican-db-sync" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.061609 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aa3596a-fac7-4d92-93fc-4d609fb54513" containerName="keystone-bootstrap" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.061625 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" containerName="barbican-db-sync" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.062247 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.068570 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.068702 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.068780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6jnw5" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.068827 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.069313 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.069497 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.081225 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-754f454454-nb48r"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165300 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-config-data\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165345 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-scripts\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " 
pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-fernet-keys\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165435 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-public-tls-certs\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-internal-tls-certs\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165473 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-credential-keys\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxmv\" (UniqueName: \"kubernetes.io/projected/803d2331-67a9-462d-9e22-09a112264732-kube-api-access-vhxmv\") pod \"keystone-754f454454-nb48r\" (UID: 
\"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.165912 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-combined-ca-bundle\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.257087 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5869cb876-lfmmz"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.265556 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267231 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxmv\" (UniqueName: \"kubernetes.io/projected/803d2331-67a9-462d-9e22-09a112264732-kube-api-access-vhxmv\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-combined-ca-bundle\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-config-data\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " 
pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-scripts\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-fernet-keys\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-public-tls-certs\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-internal-tls-certs\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.267431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-credential-keys\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.272192 
4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-hcczz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.272378 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.272434 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.273369 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-54ddd476ff-9v8dj"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.274014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-scripts\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.274018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-credential-keys\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.274761 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.274991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-combined-ca-bundle\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.278428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-internal-tls-certs\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.282411 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.282615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-public-tls-certs\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.284353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-config-data\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.287867 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-fernet-keys\") pod 
\"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.299827 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5869cb876-lfmmz"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.322831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxmv\" (UniqueName: \"kubernetes.io/projected/803d2331-67a9-462d-9e22-09a112264732-kube-api-access-vhxmv\") pod \"keystone-754f454454-nb48r\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") " pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.333907 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54ddd476ff-9v8dj"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data-custom\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369264 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data\") pod 
\"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a7dd687-d272-4102-bc70-199b44353a21-logs\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369356 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-combined-ca-bundle\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369401 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data-custom\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzcxx\" (UniqueName: \"kubernetes.io/projected/3a7dd687-d272-4102-bc70-199b44353a21-kube-api-access-mzcxx\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369478 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-psnvn\" (UniqueName: \"kubernetes.io/projected/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-kube-api-access-psnvn\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-logs\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.369544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-combined-ca-bundle\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.382557 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.421956 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-xqw9z"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.422246 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="dnsmasq-dns" containerID="cri-o://6d8831f99fe7823d0e1d703eaa4ce6e9ea09330a3359ec9bd2762a39be4cca9e" gracePeriod=10 Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.455279 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-ktjvz"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.456818 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470647 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-combined-ca-bundle\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470676 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-sb\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470731 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data-custom\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-config\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470771 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzcxx\" (UniqueName: \"kubernetes.io/projected/3a7dd687-d272-4102-bc70-199b44353a21-kube-api-access-mzcxx\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-svc\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-swift-storage-0\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lr7q7\" (UniqueName: \"kubernetes.io/projected/a7c93be5-51de-487a-ab60-4208a2e1a197-kube-api-access-lr7q7\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psnvn\" (UniqueName: \"kubernetes.io/projected/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-kube-api-access-psnvn\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.470960 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-logs\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.471023 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-combined-ca-bundle\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.471068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data-custom\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.471133 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-nb\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.471589 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-logs\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.474642 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-combined-ca-bundle\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.475182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.475239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 
00:01:13.475250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data-custom\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.475278 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a7dd687-d272-4102-bc70-199b44353a21-logs\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.475710 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a7dd687-d272-4102-bc70-199b44353a21-logs\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.475887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-combined-ca-bundle\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.480121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data-custom\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.483901 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.486825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.489361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnvn\" (UniqueName: \"kubernetes.io/projected/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-kube-api-access-psnvn\") pod \"barbican-keystone-listener-5869cb876-lfmmz\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.489428 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-ktjvz"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.491139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzcxx\" (UniqueName: \"kubernetes.io/projected/3a7dd687-d272-4102-bc70-199b44353a21-kube-api-access-mzcxx\") pod \"barbican-worker-54ddd476ff-9v8dj\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.495251 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7fb6497548-mtn8j"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.496532 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.498143 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.509750 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fb6497548-mtn8j"] Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.576986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-nb\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-sb\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577368 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-config\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: 
I1204 00:01:13.577384 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73e6508-d60b-4612-a5ab-baa659e58885-logs\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-svc\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-swift-storage-0\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577450 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-combined-ca-bundle\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577468 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr7q7\" (UniqueName: \"kubernetes.io/projected/a7c93be5-51de-487a-ab60-4208a2e1a197-kube-api-access-lr7q7\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 
00:01:13.577497 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data-custom\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577527 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqfrb\" (UniqueName: \"kubernetes.io/projected/d73e6508-d60b-4612-a5ab-baa659e58885-kube-api-access-fqfrb\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.577793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-nb\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.578527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-sb\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.578620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-svc\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.579165 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-swift-storage-0\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.579213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-config\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.609564 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr7q7\" (UniqueName: \"kubernetes.io/projected/a7c93be5-51de-487a-ab60-4208a2e1a197-kube-api-access-lr7q7\") pod \"dnsmasq-dns-5768d59dd9-ktjvz\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.678875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqfrb\" (UniqueName: \"kubernetes.io/projected/d73e6508-d60b-4612-a5ab-baa659e58885-kube-api-access-fqfrb\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.678974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.679087 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73e6508-d60b-4612-a5ab-baa659e58885-logs\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.679163 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-combined-ca-bundle\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.679215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data-custom\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.680379 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73e6508-d60b-4612-a5ab-baa659e58885-logs\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.683172 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-combined-ca-bundle\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.683389 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data-custom\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.687880 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.698329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqfrb\" (UniqueName: \"kubernetes.io/projected/d73e6508-d60b-4612-a5ab-baa659e58885-kube-api-access-fqfrb\") pod \"barbican-api-7fb6497548-mtn8j\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.735044 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.746552 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.874328 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:13 crc kubenswrapper[4764]: I1204 00:01:13.877577 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:14 crc kubenswrapper[4764]: I1204 00:01:14.217436 4764 generic.go:334] "Generic (PLEG): container finished" podID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerID="6d8831f99fe7823d0e1d703eaa4ce6e9ea09330a3359ec9bd2762a39be4cca9e" exitCode=0 Dec 04 00:01:14 crc kubenswrapper[4764]: I1204 00:01:14.217479 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" event={"ID":"bc53cddb-2106-4e89-836d-11ba8b24ef2c","Type":"ContainerDied","Data":"6d8831f99fe7823d0e1d703eaa4ce6e9ea09330a3359ec9bd2762a39be4cca9e"} Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.790651 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-789dfd9c8d-k4z96"] Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.792248 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.794234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.794867 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.813680 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-789dfd9c8d-k4z96"] Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.846390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec3e74e4-e0bc-45a3-a568-c70087b73572-logs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.846474 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrs6s\" (UniqueName: \"kubernetes.io/projected/ec3e74e4-e0bc-45a3-a568-c70087b73572-kube-api-access-jrs6s\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.846513 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.846537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-combined-ca-bundle\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.846559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data-custom\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.846593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-public-tls-certs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc 
kubenswrapper[4764]: I1204 00:01:16.846608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-internal-tls-certs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.948427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrs6s\" (UniqueName: \"kubernetes.io/projected/ec3e74e4-e0bc-45a3-a568-c70087b73572-kube-api-access-jrs6s\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.948514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.948552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-combined-ca-bundle\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.948582 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data-custom\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc 
kubenswrapper[4764]: I1204 00:01:16.948625 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-public-tls-certs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.948648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-internal-tls-certs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.948793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec3e74e4-e0bc-45a3-a568-c70087b73572-logs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.949318 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec3e74e4-e0bc-45a3-a568-c70087b73572-logs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.959050 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data-custom\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.959191 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-public-tls-certs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.959535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-internal-tls-certs\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.960123 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-combined-ca-bundle\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.962687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:16 crc kubenswrapper[4764]: I1204 00:01:16.980633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrs6s\" (UniqueName: \"kubernetes.io/projected/ec3e74e4-e0bc-45a3-a568-c70087b73572-kube-api-access-jrs6s\") pod \"barbican-api-789dfd9c8d-k4z96\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:17 crc kubenswrapper[4764]: I1204 00:01:17.136000 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:17 crc kubenswrapper[4764]: I1204 00:01:17.489702 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.015864 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.027064 4764 scope.go:117] "RemoveContainer" containerID="587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2" Dec 04 00:01:18 crc kubenswrapper[4764]: E1204 00:01:18.027640 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2\": container with ID starting with 587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2 not found: ID does not exist" containerID="587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.027682 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2"} err="failed to get container status \"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2\": rpc error: code = NotFound desc = could not find container \"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2\": container with ID starting with 587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2 not found: ID does not exist" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.027732 4764 scope.go:117] "RemoveContainer" 
containerID="5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13" Dec 04 00:01:18 crc kubenswrapper[4764]: E1204 00:01:18.028252 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13\": container with ID starting with 5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13 not found: ID does not exist" containerID="5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.028291 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13"} err="failed to get container status \"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13\": rpc error: code = NotFound desc = could not find container \"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13\": container with ID starting with 5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13 not found: ID does not exist" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.028318 4764 scope.go:117] "RemoveContainer" containerID="587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.028655 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2"} err="failed to get container status \"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2\": rpc error: code = NotFound desc = could not find container \"587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2\": container with ID starting with 587a37516fc963e744f9f754c583a71b75443ee21fae6df9c1d53212b50e90e2 not found: ID does not exist" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.028682 4764 scope.go:117] 
"RemoveContainer" containerID="5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.029147 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13"} err="failed to get container status \"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13\": rpc error: code = NotFound desc = could not find container \"5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13\": container with ID starting with 5c17169523c515b073d2c948c02361474a7c5c5280a746cedaa7efb3f365bd13 not found: ID does not exist" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171054 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-httpd-run\") pod \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171491 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-config-data\") pod \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171516 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqrfn\" (UniqueName: \"kubernetes.io/projected/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-kube-api-access-wqrfn\") pod \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-logs\") pod 
\"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-combined-ca-bundle\") pod \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171648 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.171769 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-scripts\") pod \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\" (UID: \"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.173197 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-logs" (OuterVolumeSpecName: "logs") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.173425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.184603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.185096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-kube-api-access-wqrfn" (OuterVolumeSpecName: "kube-api-access-wqrfn") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "kube-api-access-wqrfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.185468 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-scripts" (OuterVolumeSpecName: "scripts") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.211804 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.231541 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-config-data" (OuterVolumeSpecName: "config-data") pod "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" (UID: "969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.274556 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f","Type":"ContainerDied","Data":"626cda7b43794806bea3f436dffcd49765077b549b9afa66ca1a2eea7d613074"} Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.274603 4764 scope.go:117] "RemoveContainer" containerID="887170f7153697a79795b33f38cbecf3097718338deedcb4765664cf8bf16c21" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.274689 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276827 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276865 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276874 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276885 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqrfn\" (UniqueName: \"kubernetes.io/projected/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-kube-api-access-wqrfn\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276895 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276904 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.276935 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.315888 4764 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.346145 4764 scope.go:117] "RemoveContainer" containerID="6a7be4004b4e5c32864cace6ad35f674c278ac619e40c4d21e2ed4be3053d51c" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.379214 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.410041 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.437871 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.447067 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:18 crc kubenswrapper[4764]: E1204 00:01:18.447469 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-log" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.447489 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-log" Dec 04 00:01:18 crc kubenswrapper[4764]: E1204 00:01:18.447517 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-httpd" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.447526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-httpd" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.447704 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" 
containerName="glance-log" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.447732 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" containerName="glance-httpd" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.448675 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.454927 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.454969 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.455690 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.466611 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.571187 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f" path="/var/lib/kubelet/pods/969d7d3c-c264-47a5-8bbb-2e3b6c28aa5f/volumes" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.584509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-swift-storage-0\") pod \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.584580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-sb\") pod \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.584602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-svc\") pod \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.584630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-nb\") pod \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.584659 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmttc\" (UniqueName: 
\"kubernetes.io/projected/bc53cddb-2106-4e89-836d-11ba8b24ef2c-kube-api-access-cmttc\") pod \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.584800 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-config\") pod \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\" (UID: \"bc53cddb-2106-4e89-836d-11ba8b24ef2c\") " Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585079 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-scripts\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585099 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-logs\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585174 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-config-data\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4cbm\" (UniqueName: \"kubernetes.io/projected/23043d28-d496-4964-814d-864826992e99-kube-api-access-x4cbm\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.585307 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.597627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc53cddb-2106-4e89-836d-11ba8b24ef2c-kube-api-access-cmttc" 
(OuterVolumeSpecName: "kube-api-access-cmttc") pod "bc53cddb-2106-4e89-836d-11ba8b24ef2c" (UID: "bc53cddb-2106-4e89-836d-11ba8b24ef2c"). InnerVolumeSpecName "kube-api-access-cmttc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.640357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc53cddb-2106-4e89-836d-11ba8b24ef2c" (UID: "bc53cddb-2106-4e89-836d-11ba8b24ef2c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.642225 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-config" (OuterVolumeSpecName: "config") pod "bc53cddb-2106-4e89-836d-11ba8b24ef2c" (UID: "bc53cddb-2106-4e89-836d-11ba8b24ef2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.655638 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc53cddb-2106-4e89-836d-11ba8b24ef2c" (UID: "bc53cddb-2106-4e89-836d-11ba8b24ef2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.663840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bc53cddb-2106-4e89-836d-11ba8b24ef2c" (UID: "bc53cddb-2106-4e89-836d-11ba8b24ef2c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.688822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.688902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-config-data\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689029 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4cbm\" (UniqueName: \"kubernetes.io/projected/23043d28-d496-4964-814d-864826992e99-kube-api-access-x4cbm\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689064 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " 
pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689126 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-scripts\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-logs\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689396 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689407 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689417 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: 
I1204 00:01:18.689425 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmttc\" (UniqueName: \"kubernetes.io/projected/bc53cddb-2106-4e89-836d-11ba8b24ef2c-kube-api-access-cmttc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.689435 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.690273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-logs\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.690287 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.690561 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.698634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-scripts\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " 
pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.699467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-config-data\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.704150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.704928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.708525 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4cbm\" (UniqueName: \"kubernetes.io/projected/23043d28-d496-4964-814d-864826992e99-kube-api-access-x4cbm\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.712664 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc53cddb-2106-4e89-836d-11ba8b24ef2c" (UID: "bc53cddb-2106-4e89-836d-11ba8b24ef2c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.720160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.790688 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc53cddb-2106-4e89-836d-11ba8b24ef2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.803460 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:18 crc kubenswrapper[4764]: I1204 00:01:18.857080 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6d45ff9d86-725zf"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.008344 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.145098 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-ktjvz"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.165908 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-754f454454-nb48r"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.206936 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-54ddd476ff-9v8dj"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.218278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-789dfd9c8d-k4z96"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.225393 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-5869cb876-lfmmz"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.250586 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fb6497548-mtn8j"] Dec 04 00:01:19 crc kubenswrapper[4764]: W1204 00:01:19.251728 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec3e74e4_e0bc_45a3_a568_c70087b73572.slice/crio-a26e0bd88ec5655e1113239cccc30f9087d02c9584f62b06373cecbc27cdf14e WatchSource:0}: Error finding container a26e0bd88ec5655e1113239cccc30f9087d02c9584f62b06373cecbc27cdf14e: Status 404 returned error can't find the container with id a26e0bd88ec5655e1113239cccc30f9087d02c9584f62b06373cecbc27cdf14e Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.261751 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c96d99869-mwjrh"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.295065 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754f454454-nb48r" event={"ID":"803d2331-67a9-462d-9e22-09a112264732","Type":"ContainerStarted","Data":"80b91bd99d8534b3de266d2b1715863f161532c53aa2dac7499707c78064a3ef"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.296088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c96d99869-mwjrh" event={"ID":"8499c909-53fe-4742-aa11-29e214451689","Type":"ContainerStarted","Data":"82cae38fafe86fe46c4d0042fcfce2c20fa3928de79e4db413edba7a96a701a9"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.297006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" event={"ID":"a7c93be5-51de-487a-ab60-4208a2e1a197","Type":"ContainerStarted","Data":"593573587c9019ffc7e331cc0fbaa95bc9dc5dfae13537fc213b1339defc141f"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.298291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-789dfd9c8d-k4z96" event={"ID":"ec3e74e4-e0bc-45a3-a568-c70087b73572","Type":"ContainerStarted","Data":"a26e0bd88ec5655e1113239cccc30f9087d02c9584f62b06373cecbc27cdf14e"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.312628 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d45ff9d86-725zf" event={"ID":"3cad4f7f-7546-406c-822b-b6f77365d830","Type":"ContainerStarted","Data":"f0e896d911d0d02d583e2ed95e40887b3c77cf42d7b2ec1e0701283cb3e7858e"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.318781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerStarted","Data":"1427f26023c6aa4d87589f5fc6f1b2c00b2cb768ac3e4886b2aa2538386f97a8"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.320456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54ddd476ff-9v8dj" event={"ID":"3a7dd687-d272-4102-bc70-199b44353a21","Type":"ContainerStarted","Data":"c60cb3e93740d88d05d5b2fbbcc9fc4a6108cc51b9f9d43b26952458f61d1023"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.327876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" event={"ID":"ee8deb66-8364-4d9c-bd17-e4ad937a35e2","Type":"ContainerStarted","Data":"c80f1c0b8e23174292e6fc7c7bbfc58b19708fa8cd1754335684ab3e339b096d"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.337797 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb6497548-mtn8j" event={"ID":"d73e6508-d60b-4612-a5ab-baa659e58885","Type":"ContainerStarted","Data":"4ae3b8565825e4094f0a77f86856da19b311f3566dd85383fd1ce3124dbd984a"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.340287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" 
event={"ID":"bc53cddb-2106-4e89-836d-11ba8b24ef2c","Type":"ContainerDied","Data":"710700ad7f928a79c6d0bebf11b3e9bef7eeda113c2227b4e9b19651cc65a9e8"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.340373 4764 scope.go:117] "RemoveContainer" containerID="6d8831f99fe7823d0e1d703eaa4ce6e9ea09330a3359ec9bd2762a39be4cca9e" Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.340315 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dfc45497-xqw9z" Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.343907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"02fa7d42-ad30-474b-98f6-ad1e423af7cc","Type":"ContainerStarted","Data":"8c9f8342e960a6e1256da5bd11f31b19a8aacbde0a18ced4cfe1efc1312ad47e"} Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.392691 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-xqw9z"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.404125 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dfc45497-xqw9z"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.448856 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:19 crc kubenswrapper[4764]: I1204 00:01:19.458703 4764 scope.go:117] "RemoveContainer" containerID="7a55007fa879780e3161e3c603cb9940175ebdd94d857acec99dfea2a124f7bc" Dec 04 00:01:19 crc kubenswrapper[4764]: W1204 00:01:19.469611 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23043d28_d496_4964_814d_864826992e99.slice/crio-c0a24afd6695e6117a44c686297ed28fa7253684d35b85c9c041da53f951440b WatchSource:0}: Error finding container c0a24afd6695e6117a44c686297ed28fa7253684d35b85c9c041da53f951440b: Status 404 returned error can't find the container with id 
c0a24afd6695e6117a44c686297ed28fa7253684d35b85c9c041da53f951440b Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.375013 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754f454454-nb48r" event={"ID":"803d2331-67a9-462d-9e22-09a112264732","Type":"ContainerStarted","Data":"5ba3f5a666e85c1ab0ed9cf5640222917b29d19dae5da32e8c3a64bf079caafd"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.375882 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.382467 4764 generic.go:334] "Generic (PLEG): container finished" podID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerID="b870eb4592d9f8375bc124dcde52b4b0317e501c49bd82689f2271c1e00a26cb" exitCode=0 Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.382684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" event={"ID":"a7c93be5-51de-487a-ab60-4208a2e1a197","Type":"ContainerDied","Data":"b870eb4592d9f8375bc124dcde52b4b0317e501c49bd82689f2271c1e00a26cb"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.396531 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23043d28-d496-4964-814d-864826992e99","Type":"ContainerStarted","Data":"c0a24afd6695e6117a44c686297ed28fa7253684d35b85c9c041da53f951440b"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.405908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-789dfd9c8d-k4z96" event={"ID":"ec3e74e4-e0bc-45a3-a568-c70087b73572","Type":"ContainerStarted","Data":"0f1e5405b57025512e61585a9e9a3c74dacc900d7181ee5cacc158e3f86552fc"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.405972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-789dfd9c8d-k4z96" 
event={"ID":"ec3e74e4-e0bc-45a3-a568-c70087b73572","Type":"ContainerStarted","Data":"5523cc0c69a6274103cea6cdb99c2b0cb069c2b4434f1a09627b39395825d92d"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.406024 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.406645 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.413857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb6497548-mtn8j" event={"ID":"d73e6508-d60b-4612-a5ab-baa659e58885","Type":"ContainerStarted","Data":"a21134be3e3715ea13cc50eb4f6afe66e6da39a36610e0241721e4b6a97bd1ba"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.413889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb6497548-mtn8j" event={"ID":"d73e6508-d60b-4612-a5ab-baa659e58885","Type":"ContainerStarted","Data":"65143ceafc98f4f14c32243bf7a0904444ac2957292e1aa87dd2a07a716ee025"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.413927 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-754f454454-nb48r" podStartSLOduration=7.413906605 podStartE2EDuration="7.413906605s" podCreationTimestamp="2025-12-04 00:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:20.398776114 +0000 UTC m=+1216.160100525" watchObservedRunningTime="2025-12-04 00:01:20.413906605 +0000 UTC m=+1216.175231016" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.414381 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.414408 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.419100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d45ff9d86-725zf" event={"ID":"3cad4f7f-7546-406c-822b-b6f77365d830","Type":"ContainerStarted","Data":"f79070c4af81fb3ba806ca5d2c61d64116e1765e388c3c78534b5f9ef1cd7663"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.419142 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d45ff9d86-725zf" event={"ID":"3cad4f7f-7546-406c-822b-b6f77365d830","Type":"ContainerStarted","Data":"c0f14891d1b59f0d4bb85f831e2a4b7f44911359e51183f14fe60719afd8d989"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.419842 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.419881 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.432435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"02fa7d42-ad30-474b-98f6-ad1e423af7cc","Type":"ContainerStarted","Data":"e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.436809 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c96d99869-mwjrh" event={"ID":"8499c909-53fe-4742-aa11-29e214451689","Type":"ContainerStarted","Data":"a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.436848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c96d99869-mwjrh" event={"ID":"8499c909-53fe-4742-aa11-29e214451689","Type":"ContainerStarted","Data":"f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3"} Dec 04 
00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.437582 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.447991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rsw7c" event={"ID":"4312db12-846e-4bc4-8f2f-7121ac50776d","Type":"ContainerStarted","Data":"f135ac215141bc2bfaf1b9e15150725a2cd141128b694948ba130731a69c2c31"} Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.460475 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7fb6497548-mtn8j" podStartSLOduration=7.4604502759999995 podStartE2EDuration="7.460450276s" podCreationTimestamp="2025-12-04 00:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:20.44427542 +0000 UTC m=+1216.205599841" watchObservedRunningTime="2025-12-04 00:01:20.460450276 +0000 UTC m=+1216.221774687" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.474706 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-789dfd9c8d-k4z96" podStartSLOduration=4.474678255 podStartE2EDuration="4.474678255s" podCreationTimestamp="2025-12-04 00:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:20.465103841 +0000 UTC m=+1216.226428262" watchObservedRunningTime="2025-12-04 00:01:20.474678255 +0000 UTC m=+1216.236002666" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.496544 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6d45ff9d86-725zf" podStartSLOduration=10.496525180999999 podStartE2EDuration="10.496525181s" podCreationTimestamp="2025-12-04 00:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:20.487828608 +0000 UTC m=+1216.249153019" watchObservedRunningTime="2025-12-04 00:01:20.496525181 +0000 UTC m=+1216.257849592" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.517013 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-rsw7c" podStartSLOduration=3.546131242 podStartE2EDuration="37.516994863s" podCreationTimestamp="2025-12-04 00:00:43 +0000 UTC" firstStartedPulling="2025-12-04 00:00:45.082067004 +0000 UTC m=+1180.843391415" lastFinishedPulling="2025-12-04 00:01:19.052930625 +0000 UTC m=+1214.814255036" observedRunningTime="2025-12-04 00:01:20.507228174 +0000 UTC m=+1216.268552605" watchObservedRunningTime="2025-12-04 00:01:20.516994863 +0000 UTC m=+1216.278319274" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.533294 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c96d99869-mwjrh" podStartSLOduration=9.533276503 podStartE2EDuration="9.533276503s" podCreationTimestamp="2025-12-04 00:01:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:20.532233237 +0000 UTC m=+1216.293557668" watchObservedRunningTime="2025-12-04 00:01:20.533276503 +0000 UTC m=+1216.294600914" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.563485 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" path="/var/lib/kubelet/pods/bc53cddb-2106-4e89-836d-11ba8b24ef2c/volumes" Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.868577 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 04 00:01:20 crc kubenswrapper[4764]: I1204 00:01:20.868884 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:01:21 crc kubenswrapper[4764]: I1204 00:01:21.458757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23043d28-d496-4964-814d-864826992e99","Type":"ContainerStarted","Data":"d3faf2c0271e3a5e12fd1ba63c03d70a48aa65afee12da6c3b480d6626a42d39"} Dec 04 00:01:22 crc kubenswrapper[4764]: I1204 00:01:22.468103 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23043d28-d496-4964-814d-864826992e99","Type":"ContainerStarted","Data":"2dca90cfe591bbbed23db140640a1fbb6727a4ddcded951044149be2f60dd967"} Dec 04 00:01:22 crc kubenswrapper[4764]: I1204 00:01:22.471201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"02fa7d42-ad30-474b-98f6-ad1e423af7cc","Type":"ContainerStarted","Data":"a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e"} Dec 04 00:01:22 crc kubenswrapper[4764]: I1204 00:01:22.473426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" event={"ID":"a7c93be5-51de-487a-ab60-4208a2e1a197","Type":"ContainerStarted","Data":"7c17f4b59e264afd8bb3acebd62d79dcda5ef6de0d0e27194f1ae9555fc0c9bc"} Dec 04 00:01:22 crc kubenswrapper[4764]: I1204 00:01:22.490229 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.490211178 podStartE2EDuration="4.490211178s" podCreationTimestamp="2025-12-04 00:01:18 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:22.487184094 +0000 UTC m=+1218.248508505" watchObservedRunningTime="2025-12-04 00:01:22.490211178 +0000 UTC m=+1218.251535589" Dec 04 00:01:22 crc kubenswrapper[4764]: I1204 00:01:22.515217 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.515196151 podStartE2EDuration="12.515196151s" podCreationTimestamp="2025-12-04 00:01:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:22.509745997 +0000 UTC m=+1218.271070408" watchObservedRunningTime="2025-12-04 00:01:22.515196151 +0000 UTC m=+1218.276520562" Dec 04 00:01:22 crc kubenswrapper[4764]: I1204 00:01:22.538660 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" podStartSLOduration=9.538642336 podStartE2EDuration="9.538642336s" podCreationTimestamp="2025-12-04 00:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:22.528869696 +0000 UTC m=+1218.290194107" watchObservedRunningTime="2025-12-04 00:01:22.538642336 +0000 UTC m=+1218.299966747" Dec 04 00:01:23 crc kubenswrapper[4764]: I1204 00:01:23.483051 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:24 crc kubenswrapper[4764]: I1204 00:01:24.493336 4764 generic.go:334] "Generic (PLEG): container finished" podID="4312db12-846e-4bc4-8f2f-7121ac50776d" containerID="f135ac215141bc2bfaf1b9e15150725a2cd141128b694948ba130731a69c2c31" exitCode=0 Dec 04 00:01:24 crc kubenswrapper[4764]: I1204 00:01:24.493435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rsw7c" 
event={"ID":"4312db12-846e-4bc4-8f2f-7121ac50776d","Type":"ContainerDied","Data":"f135ac215141bc2bfaf1b9e15150725a2cd141128b694948ba130731a69c2c31"} Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.277984 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.388807 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-config-data\") pod \"4312db12-846e-4bc4-8f2f-7121ac50776d\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.388886 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhwcb\" (UniqueName: \"kubernetes.io/projected/4312db12-846e-4bc4-8f2f-7121ac50776d-kube-api-access-dhwcb\") pod \"4312db12-846e-4bc4-8f2f-7121ac50776d\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.388919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4312db12-846e-4bc4-8f2f-7121ac50776d-etc-machine-id\") pod \"4312db12-846e-4bc4-8f2f-7121ac50776d\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.389002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-db-sync-config-data\") pod \"4312db12-846e-4bc4-8f2f-7121ac50776d\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.389042 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-combined-ca-bundle\") pod \"4312db12-846e-4bc4-8f2f-7121ac50776d\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.389107 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-scripts\") pod \"4312db12-846e-4bc4-8f2f-7121ac50776d\" (UID: \"4312db12-846e-4bc4-8f2f-7121ac50776d\") " Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.389661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4312db12-846e-4bc4-8f2f-7121ac50776d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4312db12-846e-4bc4-8f2f-7121ac50776d" (UID: "4312db12-846e-4bc4-8f2f-7121ac50776d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.393894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-scripts" (OuterVolumeSpecName: "scripts") pod "4312db12-846e-4bc4-8f2f-7121ac50776d" (UID: "4312db12-846e-4bc4-8f2f-7121ac50776d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.394009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4312db12-846e-4bc4-8f2f-7121ac50776d" (UID: "4312db12-846e-4bc4-8f2f-7121ac50776d"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.394938 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4312db12-846e-4bc4-8f2f-7121ac50776d-kube-api-access-dhwcb" (OuterVolumeSpecName: "kube-api-access-dhwcb") pod "4312db12-846e-4bc4-8f2f-7121ac50776d" (UID: "4312db12-846e-4bc4-8f2f-7121ac50776d"). InnerVolumeSpecName "kube-api-access-dhwcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.439523 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4312db12-846e-4bc4-8f2f-7121ac50776d" (UID: "4312db12-846e-4bc4-8f2f-7121ac50776d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.450895 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.471020 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-config-data" (OuterVolumeSpecName: "config-data") pod "4312db12-846e-4bc4-8f2f-7121ac50776d" (UID: "4312db12-846e-4bc4-8f2f-7121ac50776d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.490810 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.490840 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhwcb\" (UniqueName: \"kubernetes.io/projected/4312db12-846e-4bc4-8f2f-7121ac50776d-kube-api-access-dhwcb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.490849 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4312db12-846e-4bc4-8f2f-7121ac50776d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.490858 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.490868 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.490878 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4312db12-846e-4bc4-8f2f-7121ac50776d-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.533401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54ddd476ff-9v8dj" event={"ID":"3a7dd687-d272-4102-bc70-199b44353a21","Type":"ContainerStarted","Data":"da01bc5b68c5d213728724e481b6998adc831fb8b0fab47de60acb606240ad85"} Dec 04 00:01:27 crc 
kubenswrapper[4764]: I1204 00:01:27.534981 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" event={"ID":"ee8deb66-8364-4d9c-bd17-e4ad937a35e2","Type":"ContainerStarted","Data":"24fed489eecbcfbdd5a929e01e1c310bcf42cb3baa89d27eff4bf18f0bf16997"} Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.537549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-rsw7c" event={"ID":"4312db12-846e-4bc4-8f2f-7121ac50776d","Type":"ContainerDied","Data":"b04f31726c41ec411a5c7db753e3faf74044e6ddce46edc8c2605ab42f03f9c9"} Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.537566 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b04f31726c41ec411a5c7db753e3faf74044e6ddce46edc8c2605ab42f03f9c9" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.537578 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-rsw7c" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.551201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerStarted","Data":"2645f8b247b7ca910bb95fc7a6db4963d75a645bd0eb478d62d35a9fc934ef32"} Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.551336 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-central-agent" containerID="cri-o://76298f4f61a4e278f615ba467c8bd55908c27161f4d0b447d1e0048cf514c457" gracePeriod=30 Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.551605 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.552515 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="proxy-httpd" containerID="cri-o://2645f8b247b7ca910bb95fc7a6db4963d75a645bd0eb478d62d35a9fc934ef32" gracePeriod=30 Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.552779 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-notification-agent" containerID="cri-o://969dd782fb7aff3b8ba47ddd55bfe4dfe30112f18b5f24a523613937445114b3" gracePeriod=30 Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.552878 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="sg-core" containerID="cri-o://1427f26023c6aa4d87589f5fc6f1b2c00b2cb768ac3e4886b2aa2538386f97a8" gracePeriod=30 Dec 04 00:01:27 crc kubenswrapper[4764]: I1204 00:01:27.586627 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.482399184 podStartE2EDuration="44.586607853s" podCreationTimestamp="2025-12-04 00:00:43 +0000 UTC" firstStartedPulling="2025-12-04 00:00:44.974605149 +0000 UTC m=+1180.735929560" lastFinishedPulling="2025-12-04 00:01:27.078813818 +0000 UTC m=+1222.840138229" observedRunningTime="2025-12-04 00:01:27.580045922 +0000 UTC m=+1223.341370333" watchObservedRunningTime="2025-12-04 00:01:27.586607853 +0000 UTC m=+1223.347932264" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.583047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54ddd476ff-9v8dj" event={"ID":"3a7dd687-d272-4102-bc70-199b44353a21","Type":"ContainerStarted","Data":"05700e796c28abd13f4c5635747b2b007e49376c6c97684f76cd88c2347348c6"} Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.589475 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" 
event={"ID":"ee8deb66-8364-4d9c-bd17-e4ad937a35e2","Type":"ContainerStarted","Data":"71b9323307a4b2c17ee25a0cbf4f507e0f54dcd057e06354d3d97cbd5b67d385"} Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.592891 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:28 crc kubenswrapper[4764]: E1204 00:01:28.593403 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="dnsmasq-dns" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.593426 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="dnsmasq-dns" Dec 04 00:01:28 crc kubenswrapper[4764]: E1204 00:01:28.593446 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4312db12-846e-4bc4-8f2f-7121ac50776d" containerName="cinder-db-sync" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.593462 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4312db12-846e-4bc4-8f2f-7121ac50776d" containerName="cinder-db-sync" Dec 04 00:01:28 crc kubenswrapper[4764]: E1204 00:01:28.593483 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="init" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.593490 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="init" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.593705 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4312db12-846e-4bc4-8f2f-7121ac50776d" containerName="cinder-db-sync" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.593744 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc53cddb-2106-4e89-836d-11ba8b24ef2c" containerName="dnsmasq-dns" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.594906 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.597193 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.597402 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.597525 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.597680 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9zbc6" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.613576 4764 generic.go:334] "Generic (PLEG): container finished" podID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerID="2645f8b247b7ca910bb95fc7a6db4963d75a645bd0eb478d62d35a9fc934ef32" exitCode=0 Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.613604 4764 generic.go:334] "Generic (PLEG): container finished" podID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerID="1427f26023c6aa4d87589f5fc6f1b2c00b2cb768ac3e4886b2aa2538386f97a8" exitCode=2 Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.613612 4764 generic.go:334] "Generic (PLEG): container finished" podID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerID="76298f4f61a4e278f615ba467c8bd55908c27161f4d0b447d1e0048cf514c457" exitCode=0 Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.613629 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerDied","Data":"2645f8b247b7ca910bb95fc7a6db4963d75a645bd0eb478d62d35a9fc934ef32"} Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.613655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerDied","Data":"1427f26023c6aa4d87589f5fc6f1b2c00b2cb768ac3e4886b2aa2538386f97a8"} Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.613666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerDied","Data":"76298f4f61a4e278f615ba467c8bd55908c27161f4d0b447d1e0048cf514c457"} Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.637707 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.642760 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-54ddd476ff-9v8dj" podStartSLOduration=7.925198964 podStartE2EDuration="15.642739655s" podCreationTimestamp="2025-12-04 00:01:13 +0000 UTC" firstStartedPulling="2025-12-04 00:01:19.22275155 +0000 UTC m=+1214.984075961" lastFinishedPulling="2025-12-04 00:01:26.940292241 +0000 UTC m=+1222.701616652" observedRunningTime="2025-12-04 00:01:28.607990282 +0000 UTC m=+1224.369314693" watchObservedRunningTime="2025-12-04 00:01:28.642739655 +0000 UTC m=+1224.404064066" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.665156 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-ktjvz"] Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.665761 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="dnsmasq-dns" containerID="cri-o://7c17f4b59e264afd8bb3acebd62d79dcda5ef6de0d0e27194f1ae9555fc0c9bc" gracePeriod=10 Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.673840 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.692767 
4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" podStartSLOduration=8.023431203 podStartE2EDuration="15.692744641s" podCreationTimestamp="2025-12-04 00:01:13 +0000 UTC" firstStartedPulling="2025-12-04 00:01:19.265525519 +0000 UTC m=+1215.026849930" lastFinishedPulling="2025-12-04 00:01:26.934838947 +0000 UTC m=+1222.696163368" observedRunningTime="2025-12-04 00:01:28.648210289 +0000 UTC m=+1224.409534700" watchObservedRunningTime="2025-12-04 00:01:28.692744641 +0000 UTC m=+1224.454069062" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.719932 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-6m772"] Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.722887 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-kube-api-access-ncm6v\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.723039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.723081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.723206 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.723240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.723362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.725493 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.780760 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-6m772"] Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.804169 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.804210 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824568 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvkt4\" (UniqueName: \"kubernetes.io/projected/040441ff-4e5b-4e97-aefa-01ebe3fe0720-kube-api-access-gvkt4\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824585 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-nb\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824616 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-svc\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-sb\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-swift-storage-0\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824728 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncm6v\" (UniqueName: 
\"kubernetes.io/projected/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-kube-api-access-ncm6v\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824768 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-config\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824787 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.824808 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.827320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.845527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.847807 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-scripts\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.847846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.848235 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.852421 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.854844 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-kube-api-access-ncm6v\") pod \"cinder-scheduler-0\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.856570 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.864075 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.864313 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.878062 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.155:5353: connect: connection refused" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.905725 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.915059 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.917966 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.932976 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-config\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934566 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b498904-abbf-4cb0-b0e2-ef590017152e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6lhg\" (UniqueName: \"kubernetes.io/projected/3b498904-abbf-4cb0-b0e2-ef590017152e-kube-api-access-d6lhg\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934642 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934664 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvkt4\" (UniqueName: 
\"kubernetes.io/projected/040441ff-4e5b-4e97-aefa-01ebe3fe0720-kube-api-access-gvkt4\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934680 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-nb\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934703 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934750 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-scripts\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-svc\") pod 
\"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-sb\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934821 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b498904-abbf-4cb0-b0e2-ef590017152e-logs\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.934841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-swift-storage-0\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.936278 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-config\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.937709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-nb\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " 
pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.938673 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-sb\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.938803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-swift-storage-0\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.939440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-svc\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:28 crc kubenswrapper[4764]: I1204 00:01:28.969255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvkt4\" (UniqueName: \"kubernetes.io/projected/040441ff-4e5b-4e97-aefa-01ebe3fe0720-kube-api-access-gvkt4\") pod \"dnsmasq-dns-56d54d44c7-6m772\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036316 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b498904-abbf-4cb0-b0e2-ef590017152e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036726 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6lhg\" (UniqueName: \"kubernetes.io/projected/3b498904-abbf-4cb0-b0e2-ef590017152e-kube-api-access-d6lhg\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036790 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036871 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-scripts\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036398 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b498904-abbf-4cb0-b0e2-ef590017152e-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.036977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b498904-abbf-4cb0-b0e2-ef590017152e-logs\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.038600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b498904-abbf-4cb0-b0e2-ef590017152e-logs\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.040993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.045811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.051529 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-scripts\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.051688 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.051755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.065243 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6lhg\" (UniqueName: \"kubernetes.io/projected/3b498904-abbf-4cb0-b0e2-ef590017152e-kube-api-access-d6lhg\") pod \"cinder-api-0\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") " pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.203088 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.265055 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fb6497548-mtn8j"] Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.265583 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" containerID="cri-o://65143ceafc98f4f14c32243bf7a0904444ac2957292e1aa87dd2a07a716ee025" gracePeriod=30 Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.265995 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" containerID="cri-o://a21134be3e3715ea13cc50eb4f6afe66e6da39a36610e0241721e4b6a97bd1ba" gracePeriod=30 Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.266729 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.280823 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": EOF" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.280990 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": EOF" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.283021 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": EOF" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.283216 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": EOF" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.641561 4764 generic.go:334] "Generic (PLEG): container finished" podID="d73e6508-d60b-4612-a5ab-baa659e58885" containerID="65143ceafc98f4f14c32243bf7a0904444ac2957292e1aa87dd2a07a716ee025" exitCode=143 Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.641630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb6497548-mtn8j" event={"ID":"d73e6508-d60b-4612-a5ab-baa659e58885","Type":"ContainerDied","Data":"65143ceafc98f4f14c32243bf7a0904444ac2957292e1aa87dd2a07a716ee025"} Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.643525 4764 
generic.go:334] "Generic (PLEG): container finished" podID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerID="7c17f4b59e264afd8bb3acebd62d79dcda5ef6de0d0e27194f1ae9555fc0c9bc" exitCode=0 Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.644425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" event={"ID":"a7c93be5-51de-487a-ab60-4208a2e1a197","Type":"ContainerDied","Data":"7c17f4b59e264afd8bb3acebd62d79dcda5ef6de0d0e27194f1ae9555fc0c9bc"} Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.652963 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.653153 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.735800 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.791584 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-6m772"] Dec 04 00:01:29 crc kubenswrapper[4764]: I1204 00:01:29.932196 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.580097 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.670788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" event={"ID":"040441ff-4e5b-4e97-aefa-01ebe3fe0720","Type":"ContainerStarted","Data":"81820c376e808d0e5300bd27780a7cc11f992366f7f1bb7ce55efc27c9ae2a04"} Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.671618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0efd4e-b369-44a7-8a69-41a8f3f92f57","Type":"ContainerStarted","Data":"ad7691b22e3bc7cada504715b40c92fd2f9e950355f4a48857fa5dc10f617884"} Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.676269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" event={"ID":"a7c93be5-51de-487a-ab60-4208a2e1a197","Type":"ContainerDied","Data":"593573587c9019ffc7e331cc0fbaa95bc9dc5dfae13537fc213b1339defc141f"} Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.676305 4764 scope.go:117] "RemoveContainer" containerID="7c17f4b59e264afd8bb3acebd62d79dcda5ef6de0d0e27194f1ae9555fc0c9bc" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.676423 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5768d59dd9-ktjvz" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.687884 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b498904-abbf-4cb0-b0e2-ef590017152e","Type":"ContainerStarted","Data":"4964851dc23e22314d5e4f75ba84c5a3b44ffd88c8a2906c9d4654a941760aa2"} Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.694459 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr7q7\" (UniqueName: \"kubernetes.io/projected/a7c93be5-51de-487a-ab60-4208a2e1a197-kube-api-access-lr7q7\") pod \"a7c93be5-51de-487a-ab60-4208a2e1a197\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.694867 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-config\") pod \"a7c93be5-51de-487a-ab60-4208a2e1a197\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.694917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-nb\") pod \"a7c93be5-51de-487a-ab60-4208a2e1a197\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.694935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-swift-storage-0\") pod \"a7c93be5-51de-487a-ab60-4208a2e1a197\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.694958 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-sb\") pod \"a7c93be5-51de-487a-ab60-4208a2e1a197\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.695003 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-svc\") pod \"a7c93be5-51de-487a-ab60-4208a2e1a197\" (UID: \"a7c93be5-51de-487a-ab60-4208a2e1a197\") " Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.701648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c93be5-51de-487a-ab60-4208a2e1a197-kube-api-access-lr7q7" (OuterVolumeSpecName: "kube-api-access-lr7q7") pod "a7c93be5-51de-487a-ab60-4208a2e1a197" (UID: "a7c93be5-51de-487a-ab60-4208a2e1a197"). InnerVolumeSpecName "kube-api-access-lr7q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.733371 4764 scope.go:117] "RemoveContainer" containerID="b870eb4592d9f8375bc124dcde52b4b0317e501c49bd82689f2271c1e00a26cb" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.762169 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.762210 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.770741 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-config" (OuterVolumeSpecName: "config") pod "a7c93be5-51de-487a-ab60-4208a2e1a197" (UID: "a7c93be5-51de-487a-ab60-4208a2e1a197"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.781272 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a7c93be5-51de-487a-ab60-4208a2e1a197" (UID: "a7c93be5-51de-487a-ab60-4208a2e1a197"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.797758 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.797795 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr7q7\" (UniqueName: \"kubernetes.io/projected/a7c93be5-51de-487a-ab60-4208a2e1a197-kube-api-access-lr7q7\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.797809 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-config\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.818125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a7c93be5-51de-487a-ab60-4208a2e1a197" (UID: "a7c93be5-51de-487a-ab60-4208a2e1a197"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.818646 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a7c93be5-51de-487a-ab60-4208a2e1a197" (UID: "a7c93be5-51de-487a-ab60-4208a2e1a197"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.827074 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a7c93be5-51de-487a-ab60-4208a2e1a197" (UID: "a7c93be5-51de-487a-ab60-4208a2e1a197"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.831281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.844925 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.899289 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.899493 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:30 crc kubenswrapper[4764]: I1204 00:01:30.899556 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a7c93be5-51de-487a-ab60-4208a2e1a197-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.236828 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-ktjvz"]
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.244574 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5768d59dd9-ktjvz"]
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.721206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b498904-abbf-4cb0-b0e2-ef590017152e","Type":"ContainerStarted","Data":"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"}
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.729042 4764 generic.go:334] "Generic (PLEG): container finished" podID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerID="9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b" exitCode=0
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.729129 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.729138 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.729756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" event={"ID":"040441ff-4e5b-4e97-aefa-01ebe3fe0720","Type":"ContainerDied","Data":"9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b"}
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.730631 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.730873 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 04 00:01:31 crc kubenswrapper[4764]: I1204 00:01:31.738774 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.562463 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" path="/var/lib/kubelet/pods/a7c93be5-51de-487a-ab60-4208a2e1a197/volumes"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.661434 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.662144 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.761701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b498904-abbf-4cb0-b0e2-ef590017152e","Type":"ContainerStarted","Data":"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"}
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.761916 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api-log" containerID="cri-o://44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498" gracePeriod=30
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.762023 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api" containerID="cri-o://b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f" gracePeriod=30
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.762351 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.767525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" event={"ID":"040441ff-4e5b-4e97-aefa-01ebe3fe0720","Type":"ContainerStarted","Data":"6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412"}
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.767893 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56d54d44c7-6m772"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.770946 4764 generic.go:334] "Generic (PLEG): container finished" podID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerID="969dd782fb7aff3b8ba47ddd55bfe4dfe30112f18b5f24a523613937445114b3" exitCode=0
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.770991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerDied","Data":"969dd782fb7aff3b8ba47ddd55bfe4dfe30112f18b5f24a523613937445114b3"}
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.773558 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0efd4e-b369-44a7-8a69-41a8f3f92f57","Type":"ContainerStarted","Data":"1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a"}
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.773586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0efd4e-b369-44a7-8a69-41a8f3f92f57","Type":"ContainerStarted","Data":"1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007"}
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.802580 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.802559738 podStartE2EDuration="4.802559738s" podCreationTimestamp="2025-12-04 00:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:32.777136635 +0000 UTC m=+1228.538461046" watchObservedRunningTime="2025-12-04 00:01:32.802559738 +0000 UTC m=+1228.563884149"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.810231 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" podStartSLOduration=4.810219216 podStartE2EDuration="4.810219216s" podCreationTimestamp="2025-12-04 00:01:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:32.800436636 +0000 UTC m=+1228.561761057" watchObservedRunningTime="2025-12-04 00:01:32.810219216 +0000 UTC m=+1228.571543627"
Dec 04 00:01:32 crc kubenswrapper[4764]: I1204 00:01:32.843608 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.4740383550000002 podStartE2EDuration="4.843587775s" podCreationTimestamp="2025-12-04 00:01:28 +0000 UTC" firstStartedPulling="2025-12-04 00:01:29.723237325 +0000 UTC m=+1225.484561726" lastFinishedPulling="2025-12-04 00:01:31.092786735 +0000 UTC m=+1226.854111146" observedRunningTime="2025-12-04 00:01:32.824185309 +0000 UTC m=+1228.585509720" watchObservedRunningTime="2025-12-04 00:01:32.843587775 +0000 UTC m=+1228.604912186"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.123879 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.255831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-scripts\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.256158 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-combined-ca-bundle\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.256275 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-config-data\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.256299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-run-httpd\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.256333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-log-httpd\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.256391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-sg-core-conf-yaml\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.256411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjm7r\" (UniqueName: \"kubernetes.io/projected/2542d1bd-14ae-4a06-826c-967db5f367b6-kube-api-access-vjm7r\") pod \"2542d1bd-14ae-4a06-826c-967db5f367b6\" (UID: \"2542d1bd-14ae-4a06-826c-967db5f367b6\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.258119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.258145 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.262960 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-scripts" (OuterVolumeSpecName: "scripts") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.283025 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2542d1bd-14ae-4a06-826c-967db5f367b6-kube-api-access-vjm7r" (OuterVolumeSpecName: "kube-api-access-vjm7r") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "kube-api-access-vjm7r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.309300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.358697 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.358776 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjm7r\" (UniqueName: \"kubernetes.io/projected/2542d1bd-14ae-4a06-826c-967db5f367b6-kube-api-access-vjm7r\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.358793 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.358802 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.358813 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2542d1bd-14ae-4a06-826c-967db5f367b6-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.388170 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-config-data" (OuterVolumeSpecName: "config-data") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.400990 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2542d1bd-14ae-4a06-826c-967db5f367b6" (UID: "2542d1bd-14ae-4a06-826c-967db5f367b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.423487 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.460161 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.460202 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2542d1bd-14ae-4a06-826c-967db5f367b6-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6lhg\" (UniqueName: \"kubernetes.io/projected/3b498904-abbf-4cb0-b0e2-ef590017152e-kube-api-access-d6lhg\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-scripts\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561650 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data-custom\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b498904-abbf-4cb0-b0e2-ef590017152e-logs\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561826 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b498904-abbf-4cb0-b0e2-ef590017152e-etc-machine-id\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561890 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-combined-ca-bundle\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.561956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data\") pod \"3b498904-abbf-4cb0-b0e2-ef590017152e\" (UID: \"3b498904-abbf-4cb0-b0e2-ef590017152e\") "
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.562297 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b498904-abbf-4cb0-b0e2-ef590017152e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.562698 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b498904-abbf-4cb0-b0e2-ef590017152e-logs" (OuterVolumeSpecName: "logs") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.563044 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b498904-abbf-4cb0-b0e2-ef590017152e-logs\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.563067 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b498904-abbf-4cb0-b0e2-ef590017152e-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.571146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-scripts" (OuterVolumeSpecName: "scripts") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.571228 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.578775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b498904-abbf-4cb0-b0e2-ef590017152e-kube-api-access-d6lhg" (OuterVolumeSpecName: "kube-api-access-d6lhg") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "kube-api-access-d6lhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.608674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.627808 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data" (OuterVolumeSpecName: "config-data") pod "3b498904-abbf-4cb0-b0e2-ef590017152e" (UID: "3b498904-abbf-4cb0-b0e2-ef590017152e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.665265 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6lhg\" (UniqueName: \"kubernetes.io/projected/3b498904-abbf-4cb0-b0e2-ef590017152e-kube-api-access-d6lhg\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.665298 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.665308 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.665318 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.665329 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b498904-abbf-4cb0-b0e2-ef590017152e-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.781971 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerID="b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f" exitCode=0
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.782000 4764 generic.go:334] "Generic (PLEG): container finished" podID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerID="44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498" exitCode=143
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.782033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b498904-abbf-4cb0-b0e2-ef590017152e","Type":"ContainerDied","Data":"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"}
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.782058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b498904-abbf-4cb0-b0e2-ef590017152e","Type":"ContainerDied","Data":"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"}
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.782067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b498904-abbf-4cb0-b0e2-ef590017152e","Type":"ContainerDied","Data":"4964851dc23e22314d5e4f75ba84c5a3b44ffd88c8a2906c9d4654a941760aa2"}
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.782081 4764 scope.go:117] "RemoveContainer" containerID="b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.782180 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.789149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2542d1bd-14ae-4a06-826c-967db5f367b6","Type":"ContainerDied","Data":"50028f1683db995340d7687fec79b71e9f1d67534d558a062f71f9252d9f34b8"}
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.789584 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.789594 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.789831 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.825326 4764 scope.go:117] "RemoveContainer" containerID="44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.844866 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.864213 4764 scope.go:117] "RemoveContainer" containerID="b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.864659 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f\": container with ID starting with b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f not found: ID does not exist" containerID="b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.864690 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"} err="failed to get container status \"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f\": rpc error: code = NotFound desc = could not find container \"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f\": container with ID starting with b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f not found: ID does not exist"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.864730 4764 scope.go:117] "RemoveContainer" containerID="44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.864992 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498\": container with ID starting with 44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498 not found: ID does not exist" containerID="44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.865018 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"} err="failed to get container status \"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498\": rpc error: code = NotFound desc = could not find container \"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498\": container with ID starting with 44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498 not found: ID does not exist"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.865035 4764 scope.go:117] "RemoveContainer" containerID="b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.865244 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f"} err="failed to get container status \"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f\": rpc error: code = NotFound desc = could not find container \"b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f\": container with ID starting with b8cecf8a6e55dff7c396b9e97eb3f85d057ac37241461ffb80092474ab89f18f not found: ID does not exist"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.865266 4764 scope.go:117] "RemoveContainer" containerID="44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.865438 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498"} err="failed to get container status \"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498\": rpc error: code = NotFound desc = could not find container \"44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498\": container with ID starting with 44030f0edefdf0ef086b85c547cb3ecf22381b097ecc5c631f27e931dac52498 not found: ID does not exist"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.865461 4764 scope.go:117] "RemoveContainer" containerID="2645f8b247b7ca910bb95fc7a6db4963d75a645bd0eb478d62d35a9fc934ef32"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.870591 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.912546 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.913317 4764 scope.go:117] "RemoveContainer" containerID="1427f26023c6aa4d87589f5fc6f1b2c00b2cb768ac3e4886b2aa2538386f97a8"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.918885 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931328 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931780 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-central-agent"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931799 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-central-agent"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931810 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="proxy-httpd"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931816 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="proxy-httpd"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931833 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="sg-core"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931841 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="sg-core"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931851 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api-log"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931861 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api-log"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931875 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="init"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931882 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="init"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931897 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="dnsmasq-dns"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931902 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="dnsmasq-dns"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931923 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api"
Dec 04 00:01:33 crc kubenswrapper[4764]: E1204 00:01:33.931944 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-notification-agent"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.931951 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-notification-agent"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932145 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api-log"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932163 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" containerName="cinder-api"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932178 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="sg-core"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932191 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-central-agent"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932214 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c93be5-51de-487a-ab60-4208a2e1a197" containerName="dnsmasq-dns"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932227 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="ceilometer-notification-agent"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.932238 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" containerName="proxy-httpd"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.933372 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.938435 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.941408 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.941619 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.945324 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.947464 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.951095 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 04 
00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.957646 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.966423 4764 scope.go:117] "RemoveContainer" containerID="969dd782fb7aff3b8ba47ddd55bfe4dfe30112f18b5f24a523613937445114b3" Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.985486 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:33 crc kubenswrapper[4764]: I1204 00:01:33.995096 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.001123 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.001326 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.005787 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.032776 4764 scope.go:117] "RemoveContainer" containerID="76298f4f61a4e278f615ba467c8bd55908c27161f4d0b447d1e0048cf514c457" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjcb\" (UniqueName: \"kubernetes.io/projected/7323df53-27cc-46a0-ad81-1e916db379af-kube-api-access-pzjcb\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074358 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-run-httpd\") pod \"ceilometer-0\" (UID: 
\"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074385 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-scripts\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074514 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-config-data\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 
00:01:34.074616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data-custom\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.074777 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.075049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-scripts\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.075108 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9xl2\" (UniqueName: \"kubernetes.io/projected/d7a88100-3013-41b1-9524-f89f4d4a58ed-kube-api-access-j9xl2\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.075175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/7323df53-27cc-46a0-ad81-1e916db379af-logs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.075202 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7323df53-27cc-46a0-ad81-1e916db379af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.075230 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-log-httpd\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.075275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.176876 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.176976 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " 
pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-scripts\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9xl2\" (UniqueName: \"kubernetes.io/projected/d7a88100-3013-41b1-9524-f89f4d4a58ed-kube-api-access-j9xl2\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323df53-27cc-46a0-ad81-1e916db379af-logs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177144 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7323df53-27cc-46a0-ad81-1e916db379af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177165 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-log-httpd\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177221 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7323df53-27cc-46a0-ad81-1e916db379af-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjcb\" (UniqueName: \"kubernetes.io/projected/7323df53-27cc-46a0-ad81-1e916db379af-kube-api-access-pzjcb\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177286 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-run-httpd\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-scripts\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 
00:01:34.177370 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-config-data\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177386 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.177424 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data-custom\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.178054 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-log-httpd\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.178513 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323df53-27cc-46a0-ad81-1e916db379af-logs\") pod \"cinder-api-0\" (UID: 
\"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.178955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-run-httpd\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.181744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-scripts\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.183204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.185205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.185336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.185561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.185781 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-config-data\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.186359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-scripts\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.186701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-public-tls-certs\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.187126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.189580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data-custom\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.195086 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9xl2\" (UniqueName: \"kubernetes.io/projected/d7a88100-3013-41b1-9524-f89f4d4a58ed-kube-api-access-j9xl2\") pod \"ceilometer-0\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.197422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjcb\" (UniqueName: \"kubernetes.io/projected/7323df53-27cc-46a0-ad81-1e916db379af-kube-api-access-pzjcb\") pod \"cinder-api-0\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.263644 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.367767 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.368142 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.401379 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.558578 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2542d1bd-14ae-4a06-826c-967db5f367b6" path="/var/lib/kubelet/pods/2542d1bd-14ae-4a06-826c-967db5f367b6/volumes" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.560091 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b498904-abbf-4cb0-b0e2-ef590017152e" path="/var/lib/kubelet/pods/3b498904-abbf-4cb0-b0e2-ef590017152e/volumes" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.692367 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:39088->10.217.0.156:9311: read: connection reset by peer" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.692407 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb6497548-mtn8j" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.156:9311/healthcheck\": read tcp 10.217.0.2:39104->10.217.0.156:9311: read: connection reset by peer" Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.722660 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 00:01:34 crc kubenswrapper[4764]: W1204 00:01:34.768403 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7323df53_27cc_46a0_ad81_1e916db379af.slice/crio-126a699b48bc29e5e1b7670045d2556f1e8160260049cb5acbd2c1cc87cdacd7 WatchSource:0}: Error finding container 126a699b48bc29e5e1b7670045d2556f1e8160260049cb5acbd2c1cc87cdacd7: Status 404 returned error can't find the container with id 
126a699b48bc29e5e1b7670045d2556f1e8160260049cb5acbd2c1cc87cdacd7 Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.799697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7323df53-27cc-46a0-ad81-1e916db379af","Type":"ContainerStarted","Data":"126a699b48bc29e5e1b7670045d2556f1e8160260049cb5acbd2c1cc87cdacd7"} Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.804256 4764 generic.go:334] "Generic (PLEG): container finished" podID="d73e6508-d60b-4612-a5ab-baa659e58885" containerID="a21134be3e3715ea13cc50eb4f6afe66e6da39a36610e0241721e4b6a97bd1ba" exitCode=0 Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.804313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb6497548-mtn8j" event={"ID":"d73e6508-d60b-4612-a5ab-baa659e58885","Type":"ContainerDied","Data":"a21134be3e3715ea13cc50eb4f6afe66e6da39a36610e0241721e4b6a97bd1ba"} Dec 04 00:01:34 crc kubenswrapper[4764]: I1204 00:01:34.846903 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.217799 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.300249 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-combined-ca-bundle\") pod \"d73e6508-d60b-4612-a5ab-baa659e58885\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.300335 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73e6508-d60b-4612-a5ab-baa659e58885-logs\") pod \"d73e6508-d60b-4612-a5ab-baa659e58885\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.300398 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqfrb\" (UniqueName: \"kubernetes.io/projected/d73e6508-d60b-4612-a5ab-baa659e58885-kube-api-access-fqfrb\") pod \"d73e6508-d60b-4612-a5ab-baa659e58885\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.300438 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data-custom\") pod \"d73e6508-d60b-4612-a5ab-baa659e58885\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.300469 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data\") pod \"d73e6508-d60b-4612-a5ab-baa659e58885\" (UID: \"d73e6508-d60b-4612-a5ab-baa659e58885\") " Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.300855 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d73e6508-d60b-4612-a5ab-baa659e58885-logs" (OuterVolumeSpecName: "logs") pod "d73e6508-d60b-4612-a5ab-baa659e58885" (UID: "d73e6508-d60b-4612-a5ab-baa659e58885"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.306588 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d73e6508-d60b-4612-a5ab-baa659e58885" (UID: "d73e6508-d60b-4612-a5ab-baa659e58885"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.307351 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73e6508-d60b-4612-a5ab-baa659e58885-kube-api-access-fqfrb" (OuterVolumeSpecName: "kube-api-access-fqfrb") pod "d73e6508-d60b-4612-a5ab-baa659e58885" (UID: "d73e6508-d60b-4612-a5ab-baa659e58885"). InnerVolumeSpecName "kube-api-access-fqfrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.336112 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d73e6508-d60b-4612-a5ab-baa659e58885" (UID: "d73e6508-d60b-4612-a5ab-baa659e58885"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.364315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data" (OuterVolumeSpecName: "config-data") pod "d73e6508-d60b-4612-a5ab-baa659e58885" (UID: "d73e6508-d60b-4612-a5ab-baa659e58885"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.401755 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.401945 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d73e6508-d60b-4612-a5ab-baa659e58885-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.402023 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqfrb\" (UniqueName: \"kubernetes.io/projected/d73e6508-d60b-4612-a5ab-baa659e58885-kube-api-access-fqfrb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.402097 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.402150 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d73e6508-d60b-4612-a5ab-baa659e58885-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.816836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerStarted","Data":"90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5"} Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.817192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerStarted","Data":"f58740ec5c1a2d5b6fd82e8d1a8e84c125e9bc5999873f2c2f4f18b53fcededb"} Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.821449 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb6497548-mtn8j" event={"ID":"d73e6508-d60b-4612-a5ab-baa659e58885","Type":"ContainerDied","Data":"4ae3b8565825e4094f0a77f86856da19b311f3566dd85383fd1ce3124dbd984a"} Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.821494 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fb6497548-mtn8j" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.821501 4764 scope.go:117] "RemoveContainer" containerID="a21134be3e3715ea13cc50eb4f6afe66e6da39a36610e0241721e4b6a97bd1ba" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.834188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7323df53-27cc-46a0-ad81-1e916db379af","Type":"ContainerStarted","Data":"4dec83b8647040009bb8b20db48c59cfdae71ee1a3fa1d5ef147201319666a80"} Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.851362 4764 scope.go:117] "RemoveContainer" containerID="65143ceafc98f4f14c32243bf7a0904444ac2957292e1aa87dd2a07a716ee025" Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.858179 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fb6497548-mtn8j"] Dec 04 00:01:35 crc kubenswrapper[4764]: I1204 00:01:35.868983 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7fb6497548-mtn8j"] Dec 04 00:01:36 crc kubenswrapper[4764]: I1204 00:01:36.561168 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" path="/var/lib/kubelet/pods/d73e6508-d60b-4612-a5ab-baa659e58885/volumes" Dec 04 00:01:36 crc kubenswrapper[4764]: I1204 00:01:36.845848 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"7323df53-27cc-46a0-ad81-1e916db379af","Type":"ContainerStarted","Data":"6ef4634d4e9a70890a62dc5bc5ec2d0dea18b5551be672ee6677a592a96cead8"} Dec 04 00:01:36 crc kubenswrapper[4764]: I1204 00:01:36.847311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 00:01:36 crc kubenswrapper[4764]: I1204 00:01:36.849384 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerStarted","Data":"276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c"} Dec 04 00:01:36 crc kubenswrapper[4764]: I1204 00:01:36.871508 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.8714846830000003 podStartE2EDuration="3.871484683s" podCreationTimestamp="2025-12-04 00:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:36.865699141 +0000 UTC m=+1232.627023572" watchObservedRunningTime="2025-12-04 00:01:36.871484683 +0000 UTC m=+1232.632809104" Dec 04 00:01:37 crc kubenswrapper[4764]: I1204 00:01:37.672579 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:37 crc kubenswrapper[4764]: I1204 00:01:37.862864 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerStarted","Data":"198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e"} Dec 04 00:01:38 crc kubenswrapper[4764]: I1204 00:01:38.873247 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerStarted","Data":"dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649"} Dec 04 
00:01:38 crc kubenswrapper[4764]: I1204 00:01:38.929929 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.564413686 podStartE2EDuration="5.929904999s" podCreationTimestamp="2025-12-04 00:01:33 +0000 UTC" firstStartedPulling="2025-12-04 00:01:34.862102261 +0000 UTC m=+1230.623426672" lastFinishedPulling="2025-12-04 00:01:38.227593564 +0000 UTC m=+1233.988917985" observedRunningTime="2025-12-04 00:01:38.922130208 +0000 UTC m=+1234.683454619" watchObservedRunningTime="2025-12-04 00:01:38.929904999 +0000 UTC m=+1234.691229410" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.053193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.107328 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59bfd87765-ztwrc"] Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.107546 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerName="dnsmasq-dns" containerID="cri-o://7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490" gracePeriod=10 Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.169311 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.218194 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.634538 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.683912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-nb\") pod \"ff9327f4-311d-47f7-a6c0-2daf84054201\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.684072 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-svc\") pod \"ff9327f4-311d-47f7-a6c0-2daf84054201\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.684120 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-config\") pod \"ff9327f4-311d-47f7-a6c0-2daf84054201\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.684149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58pzk\" (UniqueName: \"kubernetes.io/projected/ff9327f4-311d-47f7-a6c0-2daf84054201-kube-api-access-58pzk\") pod \"ff9327f4-311d-47f7-a6c0-2daf84054201\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.684286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-sb\") pod \"ff9327f4-311d-47f7-a6c0-2daf84054201\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.684309 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-swift-storage-0\") pod \"ff9327f4-311d-47f7-a6c0-2daf84054201\" (UID: \"ff9327f4-311d-47f7-a6c0-2daf84054201\") " Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.696908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff9327f4-311d-47f7-a6c0-2daf84054201-kube-api-access-58pzk" (OuterVolumeSpecName: "kube-api-access-58pzk") pod "ff9327f4-311d-47f7-a6c0-2daf84054201" (UID: "ff9327f4-311d-47f7-a6c0-2daf84054201"). InnerVolumeSpecName "kube-api-access-58pzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.740627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff9327f4-311d-47f7-a6c0-2daf84054201" (UID: "ff9327f4-311d-47f7-a6c0-2daf84054201"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.748418 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-config" (OuterVolumeSpecName: "config") pod "ff9327f4-311d-47f7-a6c0-2daf84054201" (UID: "ff9327f4-311d-47f7-a6c0-2daf84054201"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.751095 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff9327f4-311d-47f7-a6c0-2daf84054201" (UID: "ff9327f4-311d-47f7-a6c0-2daf84054201"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.755876 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff9327f4-311d-47f7-a6c0-2daf84054201" (UID: "ff9327f4-311d-47f7-a6c0-2daf84054201"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.764953 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ff9327f4-311d-47f7-a6c0-2daf84054201" (UID: "ff9327f4-311d-47f7-a6c0-2daf84054201"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.786743 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.786770 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.786782 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58pzk\" (UniqueName: \"kubernetes.io/projected/ff9327f4-311d-47f7-a6c0-2daf84054201-kube-api-access-58pzk\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.786795 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.786804 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.786813 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff9327f4-311d-47f7-a6c0-2daf84054201-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.887250 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerID="7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490" exitCode=0 Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.887319 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.887315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" event={"ID":"ff9327f4-311d-47f7-a6c0-2daf84054201","Type":"ContainerDied","Data":"7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490"} Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.887380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59bfd87765-ztwrc" event={"ID":"ff9327f4-311d-47f7-a6c0-2daf84054201","Type":"ContainerDied","Data":"1f11f34a20afde9032dbfbcdf947e37858048f0ccb9458e010799d636c80a29f"} Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.887408 4764 scope.go:117] "RemoveContainer" containerID="7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.887883 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 00:01:39 crc 
kubenswrapper[4764]: I1204 00:01:39.887908 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="cinder-scheduler" containerID="cri-o://1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007" gracePeriod=30 Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.888011 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="probe" containerID="cri-o://1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a" gracePeriod=30 Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.915542 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59bfd87765-ztwrc"] Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.920412 4764 scope.go:117] "RemoveContainer" containerID="cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.923786 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59bfd87765-ztwrc"] Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.952555 4764 scope.go:117] "RemoveContainer" containerID="7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490" Dec 04 00:01:39 crc kubenswrapper[4764]: E1204 00:01:39.953462 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490\": container with ID starting with 7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490 not found: ID does not exist" containerID="7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.953490 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490"} err="failed to get container status \"7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490\": rpc error: code = NotFound desc = could not find container \"7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490\": container with ID starting with 7e94265b3484ee5d1a5f25fb1db551e108bd98cb47f4aacd800ae1acb1b8a490 not found: ID does not exist" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.953510 4764 scope.go:117] "RemoveContainer" containerID="cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847" Dec 04 00:01:39 crc kubenswrapper[4764]: E1204 00:01:39.954978 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847\": container with ID starting with cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847 not found: ID does not exist" containerID="cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847" Dec 04 00:01:39 crc kubenswrapper[4764]: I1204 00:01:39.955023 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847"} err="failed to get container status \"cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847\": rpc error: code = NotFound desc = could not find container \"cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847\": container with ID starting with cf47dce02bc4d5db713cf28c37de3681befc7975322c2d53d6cdf4d0bbbe8847 not found: ID does not exist" Dec 04 00:01:40 crc kubenswrapper[4764]: I1204 00:01:40.577867 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" path="/var/lib/kubelet/pods/ff9327f4-311d-47f7-a6c0-2daf84054201/volumes" Dec 04 00:01:40 crc kubenswrapper[4764]: I1204 
00:01:40.900362 4764 generic.go:334] "Generic (PLEG): container finished" podID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerID="1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a" exitCode=0 Dec 04 00:01:40 crc kubenswrapper[4764]: I1204 00:01:40.900924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0efd4e-b369-44a7-8a69-41a8f3f92f57","Type":"ContainerDied","Data":"1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a"} Dec 04 00:01:41 crc kubenswrapper[4764]: I1204 00:01:41.454796 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:41 crc kubenswrapper[4764]: I1204 00:01:41.463885 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:01:42 crc kubenswrapper[4764]: I1204 00:01:42.286962 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:01:42 crc kubenswrapper[4764]: I1204 00:01:42.341183 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7767dcd5bd-r5prb"] Dec 04 00:01:42 crc kubenswrapper[4764]: I1204 00:01:42.341397 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7767dcd5bd-r5prb" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-api" containerID="cri-o://0db85c34c4501b51f6ef73acd76ff0d05d82561b978a0f0f6c72255b04fb1889" gracePeriod=30 Dec 04 00:01:42 crc kubenswrapper[4764]: I1204 00:01:42.341493 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7767dcd5bd-r5prb" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-httpd" containerID="cri-o://b1754d1518a91dca7ffe4af19acf1ff233c9d4a546ae1db616a051e6e998fa04" gracePeriod=30 Dec 04 00:01:42 crc kubenswrapper[4764]: I1204 00:01:42.920986 4764 
generic.go:334] "Generic (PLEG): container finished" podID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerID="b1754d1518a91dca7ffe4af19acf1ff233c9d4a546ae1db616a051e6e998fa04" exitCode=0 Dec 04 00:01:42 crc kubenswrapper[4764]: I1204 00:01:42.921022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7767dcd5bd-r5prb" event={"ID":"71db3b5f-8617-41ff-b0d2-5734f1941648","Type":"ContainerDied","Data":"b1754d1518a91dca7ffe4af19acf1ff233c9d4a546ae1db616a051e6e998fa04"} Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.381947 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.452956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-etc-machine-id\") pod \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.453000 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-scripts\") pod \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.453031 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-combined-ca-bundle\") pod \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.453064 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-kube-api-access-ncm6v\") 
pod \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.453093 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data\") pod \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.453106 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data-custom\") pod \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\" (UID: \"9c0efd4e-b369-44a7-8a69-41a8f3f92f57\") " Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.454375 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9c0efd4e-b369-44a7-8a69-41a8f3f92f57" (UID: "9c0efd4e-b369-44a7-8a69-41a8f3f92f57"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.459251 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-kube-api-access-ncm6v" (OuterVolumeSpecName: "kube-api-access-ncm6v") pod "9c0efd4e-b369-44a7-8a69-41a8f3f92f57" (UID: "9c0efd4e-b369-44a7-8a69-41a8f3f92f57"). InnerVolumeSpecName "kube-api-access-ncm6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.459609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-scripts" (OuterVolumeSpecName: "scripts") pod "9c0efd4e-b369-44a7-8a69-41a8f3f92f57" (UID: "9c0efd4e-b369-44a7-8a69-41a8f3f92f57"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.476648 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9c0efd4e-b369-44a7-8a69-41a8f3f92f57" (UID: "9c0efd4e-b369-44a7-8a69-41a8f3f92f57"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.516902 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c0efd4e-b369-44a7-8a69-41a8f3f92f57" (UID: "9c0efd4e-b369-44a7-8a69-41a8f3f92f57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.559195 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.559222 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.559232 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.559241 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncm6v\" (UniqueName: \"kubernetes.io/projected/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-kube-api-access-ncm6v\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.559252 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.573846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data" (OuterVolumeSpecName: "config-data") pod "9c0efd4e-b369-44a7-8a69-41a8f3f92f57" (UID: "9c0efd4e-b369-44a7-8a69-41a8f3f92f57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.661104 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c0efd4e-b369-44a7-8a69-41a8f3f92f57-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.931293 4764 generic.go:334] "Generic (PLEG): container finished" podID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerID="1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007" exitCode=0 Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.931339 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0efd4e-b369-44a7-8a69-41a8f3f92f57","Type":"ContainerDied","Data":"1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007"} Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.931366 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9c0efd4e-b369-44a7-8a69-41a8f3f92f57","Type":"ContainerDied","Data":"ad7691b22e3bc7cada504715b40c92fd2f9e950355f4a48857fa5dc10f617884"} Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.931383 4764 scope.go:117] "RemoveContainer" containerID="1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.931404 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.966448 4764 scope.go:117] "RemoveContainer" containerID="1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007" Dec 04 00:01:43 crc kubenswrapper[4764]: I1204 00:01:43.987352 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.003832 4764 scope.go:117] "RemoveContainer" containerID="1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.007986 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a\": container with ID starting with 1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a not found: ID does not exist" containerID="1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.008025 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a"} err="failed to get container status \"1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a\": rpc error: code = NotFound desc = could not find container \"1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a\": container with ID starting with 1a4b36eeabef7f7c9daf680fd92a96d963e793c8ca805600284630045b1df24a not found: ID does not exist" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.008049 4764 scope.go:117] "RemoveContainer" containerID="1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.008703 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007\": container with ID starting with 1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007 not found: ID does not exist" containerID="1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.008764 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007"} err="failed to get container status \"1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007\": rpc error: code = NotFound desc = could not find container \"1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007\": container with ID starting with 1fb492dd0db82f79434ffd9e357dbefc95ed389fb26fe5962ff702bff21a6007 not found: ID does not exist" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.013027 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.039744 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.040167 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="cinder-scheduler" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040182 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="cinder-scheduler" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.040197 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="probe" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040202 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="probe" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 
00:01:44.040216 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerName="dnsmasq-dns" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040222 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerName="dnsmasq-dns" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.040237 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040243 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.040259 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040265 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" Dec 04 00:01:44 crc kubenswrapper[4764]: E1204 00:01:44.040283 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerName="init" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerName="init" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040439 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff9327f4-311d-47f7-a6c0-2daf84054201" containerName="dnsmasq-dns" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040454 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api-log" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040468 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="cinder-scheduler" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040481 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73e6508-d60b-4612-a5ab-baa659e58885" containerName="barbican-api" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.040492 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" containerName="probe" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.041401 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.044682 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.048615 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.171286 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.171391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.171420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.171544 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.171574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-scripts\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.171620 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvnbm\" (UniqueName: \"kubernetes.io/projected/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-kube-api-access-jvnbm\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.272736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.272785 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-scripts\") pod \"cinder-scheduler-0\" 
(UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.272818 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvnbm\" (UniqueName: \"kubernetes.io/projected/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-kube-api-access-jvnbm\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.272838 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.272888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.272897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.273007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.277779 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-scripts\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.277826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.277835 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.282522 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.313232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvnbm\" (UniqueName: \"kubernetes.io/projected/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-kube-api-access-jvnbm\") pod \"cinder-scheduler-0\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.362757 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.574819 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0efd4e-b369-44a7-8a69-41a8f3f92f57" path="/var/lib/kubelet/pods/9c0efd4e-b369-44a7-8a69-41a8f3f92f57/volumes" Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.851373 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.977482 4764 generic.go:334] "Generic (PLEG): container finished" podID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerID="0db85c34c4501b51f6ef73acd76ff0d05d82561b978a0f0f6c72255b04fb1889" exitCode=0 Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.977567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7767dcd5bd-r5prb" event={"ID":"71db3b5f-8617-41ff-b0d2-5734f1941648","Type":"ContainerDied","Data":"0db85c34c4501b51f6ef73acd76ff0d05d82561b978a0f0f6c72255b04fb1889"} Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.979774 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6","Type":"ContainerStarted","Data":"89d7833bf6d408563cb0d7a8f6e57e65de5540cc3cf157d12e68b7c5b666161b"} Dec 04 00:01:44 crc kubenswrapper[4764]: I1204 00:01:44.992261 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.092737 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pc9w\" (UniqueName: \"kubernetes.io/projected/71db3b5f-8617-41ff-b0d2-5734f1941648-kube-api-access-7pc9w\") pod \"71db3b5f-8617-41ff-b0d2-5734f1941648\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.092807 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-ovndb-tls-certs\") pod \"71db3b5f-8617-41ff-b0d2-5734f1941648\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.092903 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-httpd-config\") pod \"71db3b5f-8617-41ff-b0d2-5734f1941648\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.092966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-combined-ca-bundle\") pod \"71db3b5f-8617-41ff-b0d2-5734f1941648\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.092995 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-config\") pod \"71db3b5f-8617-41ff-b0d2-5734f1941648\" (UID: \"71db3b5f-8617-41ff-b0d2-5734f1941648\") " Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.098777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/71db3b5f-8617-41ff-b0d2-5734f1941648-kube-api-access-7pc9w" (OuterVolumeSpecName: "kube-api-access-7pc9w") pod "71db3b5f-8617-41ff-b0d2-5734f1941648" (UID: "71db3b5f-8617-41ff-b0d2-5734f1941648"). InnerVolumeSpecName "kube-api-access-7pc9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.100476 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "71db3b5f-8617-41ff-b0d2-5734f1941648" (UID: "71db3b5f-8617-41ff-b0d2-5734f1941648"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.157928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71db3b5f-8617-41ff-b0d2-5734f1941648" (UID: "71db3b5f-8617-41ff-b0d2-5734f1941648"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.163617 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-config" (OuterVolumeSpecName: "config") pod "71db3b5f-8617-41ff-b0d2-5734f1941648" (UID: "71db3b5f-8617-41ff-b0d2-5734f1941648"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.165555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "71db3b5f-8617-41ff-b0d2-5734f1941648" (UID: "71db3b5f-8617-41ff-b0d2-5734f1941648"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.196509 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pc9w\" (UniqueName: \"kubernetes.io/projected/71db3b5f-8617-41ff-b0d2-5734f1941648-kube-api-access-7pc9w\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.196545 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.196554 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.196563 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.196571 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/71db3b5f-8617-41ff-b0d2-5734f1941648-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:45 crc kubenswrapper[4764]: I1204 00:01:45.205952 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-754f454454-nb48r" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.007829 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 00:01:46 crc kubenswrapper[4764]: E1204 00:01:46.008582 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-api" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.008594 4764 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-api" Dec 04 00:01:46 crc kubenswrapper[4764]: E1204 00:01:46.008611 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-httpd" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.008618 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-httpd" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.008797 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-httpd" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.008810 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" containerName="neutron-api" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.009375 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.012211 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.012649 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-cm74c" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.012780 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.022016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7767dcd5bd-r5prb" event={"ID":"71db3b5f-8617-41ff-b0d2-5734f1941648","Type":"ContainerDied","Data":"1d5091a1324a1252c43f5dfb80e875fa8e87b99798954ca6cfe2cacd0392d5a5"} Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.022054 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7767dcd5bd-r5prb" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.022079 4764 scope.go:117] "RemoveContainer" containerID="b1754d1518a91dca7ffe4af19acf1ff233c9d4a546ae1db616a051e6e998fa04" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.024605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6","Type":"ContainerStarted","Data":"efeadb11f0ef1de95143824a054c252e1663f1c06d14e2f5070aa509d85dd5bf"} Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.039856 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.069893 4764 scope.go:117] "RemoveContainer" containerID="0db85c34c4501b51f6ef73acd76ff0d05d82561b978a0f0f6c72255b04fb1889" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.070262 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7767dcd5bd-r5prb"] Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.104834 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7767dcd5bd-r5prb"] Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.115557 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56kpm\" (UniqueName: \"kubernetes.io/projected/8291acae-68d4-4e14-b0a7-40d026ff1cb2-kube-api-access-56kpm\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.115634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 
00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.115665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.115698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.217205 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56kpm\" (UniqueName: \"kubernetes.io/projected/8291acae-68d4-4e14-b0a7-40d026ff1cb2-kube-api-access-56kpm\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.217488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.217517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.217534 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.218410 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.222423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.223058 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config-secret\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.244882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56kpm\" (UniqueName: \"kubernetes.io/projected/8291acae-68d4-4e14-b0a7-40d026ff1cb2-kube-api-access-56kpm\") pod \"openstackclient\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.335079 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.564903 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71db3b5f-8617-41ff-b0d2-5734f1941648" path="/var/lib/kubelet/pods/71db3b5f-8617-41ff-b0d2-5734f1941648/volumes" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.608341 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 00:01:46 crc kubenswrapper[4764]: I1204 00:01:46.833991 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 00:01:46 crc kubenswrapper[4764]: W1204 00:01:46.839577 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8291acae_68d4_4e14_b0a7_40d026ff1cb2.slice/crio-55611cbeb60f17e7b6a032baf912103033a8e9342b8405a5a8f964760572509a WatchSource:0}: Error finding container 55611cbeb60f17e7b6a032baf912103033a8e9342b8405a5a8f964760572509a: Status 404 returned error can't find the container with id 55611cbeb60f17e7b6a032baf912103033a8e9342b8405a5a8f964760572509a Dec 04 00:01:47 crc kubenswrapper[4764]: I1204 00:01:47.039346 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6","Type":"ContainerStarted","Data":"9a0f09120101a0329c1149dc889b69fb1f61190068470f9d33bdfee1ebb1a78d"} Dec 04 00:01:47 crc kubenswrapper[4764]: I1204 00:01:47.042471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8291acae-68d4-4e14-b0a7-40d026ff1cb2","Type":"ContainerStarted","Data":"55611cbeb60f17e7b6a032baf912103033a8e9342b8405a5a8f964760572509a"} Dec 04 00:01:47 crc kubenswrapper[4764]: I1204 00:01:47.070738 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.0707089 
podStartE2EDuration="4.0707089s" podCreationTimestamp="2025-12-04 00:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:47.069384737 +0000 UTC m=+1242.830709158" watchObservedRunningTime="2025-12-04 00:01:47.0707089 +0000 UTC m=+1242.832033311" Dec 04 00:01:49 crc kubenswrapper[4764]: I1204 00:01:49.363666 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 00:01:50 crc kubenswrapper[4764]: I1204 00:01:50.868960 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:01:50 crc kubenswrapper[4764]: I1204 00:01:50.869017 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.375462 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.375996 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-log" containerID="cri-o://d3faf2c0271e3a5e12fd1ba63c03d70a48aa65afee12da6c3b480d6626a42d39" gracePeriod=30 Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.376057 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-httpd" containerID="cri-o://2dca90cfe591bbbed23db140640a1fbb6727a4ddcded951044149be2f60dd967" gracePeriod=30 Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.707386 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6677596fcf-6rh2n"] Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.709058 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.711579 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.713965 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.719017 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.727780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6677596fcf-6rh2n"] Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.822987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-combined-ca-bundle\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-internal-tls-certs\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " 
pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrfz\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-kube-api-access-lrrfz\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823354 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-run-httpd\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-config-data\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823483 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-log-httpd\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823759 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-public-tls-certs\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: 
\"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.823848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-etc-swift\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-combined-ca-bundle\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-internal-tls-certs\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925629 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrfz\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-kube-api-access-lrrfz\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925671 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-run-httpd\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: 
\"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925695 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-config-data\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925730 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-log-httpd\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-public-tls-certs\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.925835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-etc-swift\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.927167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-run-httpd\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 
00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.927185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-log-httpd\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.933424 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-internal-tls-certs\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.933536 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-config-data\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.933652 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-etc-swift\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.934471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-public-tls-certs\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.935282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-combined-ca-bundle\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:51 crc kubenswrapper[4764]: I1204 00:01:51.946594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrfz\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-kube-api-access-lrrfz\") pod \"swift-proxy-6677596fcf-6rh2n\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.035034 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.086277 4764 generic.go:334] "Generic (PLEG): container finished" podID="23043d28-d496-4964-814d-864826992e99" containerID="d3faf2c0271e3a5e12fd1ba63c03d70a48aa65afee12da6c3b480d6626a42d39" exitCode=143 Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.086326 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23043d28-d496-4964-814d-864826992e99","Type":"ContainerDied","Data":"d3faf2c0271e3a5e12fd1ba63c03d70a48aa65afee12da6c3b480d6626a42d39"} Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.487394 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.487903 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-central-agent" containerID="cri-o://90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5" gracePeriod=30 Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.488526 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="proxy-httpd" containerID="cri-o://dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649" gracePeriod=30 Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.488575 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="sg-core" containerID="cri-o://198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e" gracePeriod=30 Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.488639 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-notification-agent" containerID="cri-o://276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c" gracePeriod=30 Dec 04 00:01:52 crc kubenswrapper[4764]: I1204 00:01:52.501452 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.163:3000/\": EOF" Dec 04 00:01:53 crc kubenswrapper[4764]: I1204 00:01:53.119121 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerID="dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649" exitCode=0 Dec 04 00:01:53 crc kubenswrapper[4764]: I1204 00:01:53.119149 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerID="198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e" exitCode=2 Dec 04 00:01:53 crc kubenswrapper[4764]: I1204 00:01:53.119156 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7a88100-3013-41b1-9524-f89f4d4a58ed" 
containerID="90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5" exitCode=0 Dec 04 00:01:53 crc kubenswrapper[4764]: I1204 00:01:53.119174 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerDied","Data":"dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649"} Dec 04 00:01:53 crc kubenswrapper[4764]: I1204 00:01:53.119201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerDied","Data":"198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e"} Dec 04 00:01:53 crc kubenswrapper[4764]: I1204 00:01:53.119210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerDied","Data":"90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5"} Dec 04 00:01:54 crc kubenswrapper[4764]: I1204 00:01:54.360222 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:54 crc kubenswrapper[4764]: I1204 00:01:54.360768 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-log" containerID="cri-o://e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e" gracePeriod=30 Dec 04 00:01:54 crc kubenswrapper[4764]: I1204 00:01:54.360875 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-httpd" containerID="cri-o://a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e" gracePeriod=30 Dec 04 00:01:54 crc kubenswrapper[4764]: I1204 00:01:54.628287 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Dec 04 00:01:55 crc kubenswrapper[4764]: I1204 00:01:55.167986 4764 generic.go:334] "Generic (PLEG): container finished" podID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerID="e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e" exitCode=143 Dec 04 00:01:55 crc kubenswrapper[4764]: I1204 00:01:55.168163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"02fa7d42-ad30-474b-98f6-ad1e423af7cc","Type":"ContainerDied","Data":"e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e"} Dec 04 00:01:55 crc kubenswrapper[4764]: I1204 00:01:55.171865 4764 generic.go:334] "Generic (PLEG): container finished" podID="23043d28-d496-4964-814d-864826992e99" containerID="2dca90cfe591bbbed23db140640a1fbb6727a4ddcded951044149be2f60dd967" exitCode=0 Dec 04 00:01:55 crc kubenswrapper[4764]: I1204 00:01:55.171906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23043d28-d496-4964-814d-864826992e99","Type":"ContainerDied","Data":"2dca90cfe591bbbed23db140640a1fbb6727a4ddcded951044149be2f60dd967"} Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.147087 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.198579 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23043d28-d496-4964-814d-864826992e99","Type":"ContainerDied","Data":"c0a24afd6695e6117a44c686297ed28fa7253684d35b85c9c041da53f951440b"} Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.198632 4764 scope.go:117] "RemoveContainer" containerID="2dca90cfe591bbbed23db140640a1fbb6727a4ddcded951044149be2f60dd967" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.198787 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.219650 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6677596fcf-6rh2n"] Dec 04 00:01:57 crc kubenswrapper[4764]: W1204 00:01:57.223832 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62d2ebe6_a49b_4835_bac7_86fbf33bd6c7.slice/crio-9e51ccda65fd06323a1fed7f40332cb10d519bc63c4d26e908da491229a063ed WatchSource:0}: Error finding container 9e51ccda65fd06323a1fed7f40332cb10d519bc63c4d26e908da491229a063ed: Status 404 returned error can't find the container with id 9e51ccda65fd06323a1fed7f40332cb10d519bc63c4d26e908da491229a063ed Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.234611 4764 scope.go:117] "RemoveContainer" containerID="d3faf2c0271e3a5e12fd1ba63c03d70a48aa65afee12da6c3b480d6626a42d39" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.324431 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-public-tls-certs\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.324493 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4cbm\" (UniqueName: \"kubernetes.io/projected/23043d28-d496-4964-814d-864826992e99-kube-api-access-x4cbm\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.324523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 
04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.324645 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-httpd-run\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.324988 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-scripts\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.325039 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-combined-ca-bundle\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.325173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-logs\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.325209 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-config-data\") pod \"23043d28-d496-4964-814d-864826992e99\" (UID: \"23043d28-d496-4964-814d-864826992e99\") " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.325777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.325925 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.326009 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-logs" (OuterVolumeSpecName: "logs") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.330004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.330485 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23043d28-d496-4964-814d-864826992e99-kube-api-access-x4cbm" (OuterVolumeSpecName: "kube-api-access-x4cbm") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "kube-api-access-x4cbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.331784 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-scripts" (OuterVolumeSpecName: "scripts") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.358519 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.378541 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-config-data" (OuterVolumeSpecName: "config-data") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.397129 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "23043d28-d496-4964-814d-864826992e99" (UID: "23043d28-d496-4964-814d-864826992e99"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427533 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427572 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427589 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23043d28-d496-4964-814d-864826992e99-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427601 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427613 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23043d28-d496-4964-814d-864826992e99-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427624 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4cbm\" (UniqueName: \"kubernetes.io/projected/23043d28-d496-4964-814d-864826992e99-kube-api-access-x4cbm\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.427657 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.455587 4764 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.529682 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.774942 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.798214 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.804256 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:57 crc kubenswrapper[4764]: E1204 00:01:57.804694 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-log" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.804724 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-log" Dec 04 00:01:57 crc kubenswrapper[4764]: E1204 00:01:57.804740 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-httpd" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.804746 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-httpd" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.804925 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-log" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.804940 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="23043d28-d496-4964-814d-864826992e99" containerName="glance-httpd" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.805913 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.809541 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.809668 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.818934 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.874595 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.874650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.874705 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-logs\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: 
I1204 00:01:57.875003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.875050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.875072 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lj9m\" (UniqueName: \"kubernetes.io/projected/e6de1323-46ca-460b-8a8f-620125ce1d7f-kube-api-access-7lj9m\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.875128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.875161 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " 
pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977431 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-logs\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977463 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977494 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977509 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lj9m\" (UniqueName: \"kubernetes.io/projected/e6de1323-46ca-460b-8a8f-620125ce1d7f-kube-api-access-7lj9m\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977583 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.977823 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.982155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-scripts\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.982516 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-logs\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.982538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:57 crc kubenswrapper[4764]: I1204 00:01:57.988891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.007449 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lj9m\" (UniqueName: \"kubernetes.io/projected/e6de1323-46ca-460b-8a8f-620125ce1d7f-kube-api-access-7lj9m\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.021298 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.021769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-config-data\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.030131 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " pod="openstack/glance-default-external-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.073443 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.125859 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.127286 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9xl2\" (UniqueName: \"kubernetes.io/projected/d7a88100-3013-41b1-9524-f89f4d4a58ed-kube-api-access-j9xl2\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-internal-tls-certs\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180418 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-config-data\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180459 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-scripts\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-config-data\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180529 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-scripts\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180554 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-combined-ca-bundle\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180595 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-httpd-run\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180732 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-run-httpd\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.180799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-logs\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.181412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.181606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqch2\" (UniqueName: \"kubernetes.io/projected/02fa7d42-ad30-474b-98f6-ad1e423af7cc-kube-api-access-pqch2\") pod \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\" (UID: \"02fa7d42-ad30-474b-98f6-ad1e423af7cc\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.181638 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-sg-core-conf-yaml\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.181677 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-combined-ca-bundle\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.181772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-log-httpd\") pod \"d7a88100-3013-41b1-9524-f89f4d4a58ed\" (UID: \"d7a88100-3013-41b1-9524-f89f4d4a58ed\") " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.182354 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc 
kubenswrapper[4764]: I1204 00:01:58.183028 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" (UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.183535 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" (UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.183916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-logs" (OuterVolumeSpecName: "logs") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.186518 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7a88100-3013-41b1-9524-f89f4d4a58ed-kube-api-access-j9xl2" (OuterVolumeSpecName: "kube-api-access-j9xl2") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" (UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). InnerVolumeSpecName "kube-api-access-j9xl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.186540 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-scripts" (OuterVolumeSpecName: "scripts") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.187897 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.196887 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-scripts" (OuterVolumeSpecName: "scripts") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" (UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.196965 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02fa7d42-ad30-474b-98f6-ad1e423af7cc-kube-api-access-pqch2" (OuterVolumeSpecName: "kube-api-access-pqch2") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "kube-api-access-pqch2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.248544 4764 generic.go:334] "Generic (PLEG): container finished" podID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerID="a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e" exitCode=0 Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.248613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"02fa7d42-ad30-474b-98f6-ad1e423af7cc","Type":"ContainerDied","Data":"a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.248645 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"02fa7d42-ad30-474b-98f6-ad1e423af7cc","Type":"ContainerDied","Data":"8c9f8342e960a6e1256da5bd11f31b19a8aacbde0a18ced4cfe1efc1312ad47e"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.248665 4764 scope.go:117] "RemoveContainer" containerID="a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.248825 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.269403 4764 generic.go:334] "Generic (PLEG): container finished" podID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerID="276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c" exitCode=0 Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.269609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerDied","Data":"276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.269635 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d7a88100-3013-41b1-9524-f89f4d4a58ed","Type":"ContainerDied","Data":"f58740ec5c1a2d5b6fd82e8d1a8e84c125e9bc5999873f2c2f4f18b53fcededb"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.269693 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.272321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8291acae-68d4-4e14-b0a7-40d026ff1cb2","Type":"ContainerStarted","Data":"563686178a29a5fa1144486490dc7517a511b76ab0bc8be6b06e40785fc8ba8a"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.280189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.280243 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.280609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" (UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.281243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6677596fcf-6rh2n" event={"ID":"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7","Type":"ContainerStarted","Data":"7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.281280 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6677596fcf-6rh2n" event={"ID":"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7","Type":"ContainerStarted","Data":"3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.281289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6677596fcf-6rh2n" event={"ID":"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7","Type":"ContainerStarted","Data":"9e51ccda65fd06323a1fed7f40332cb10d519bc63c4d26e908da491229a063ed"} Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.281472 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.281512 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287291 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287318 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287328 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02fa7d42-ad30-474b-98f6-ad1e423af7cc-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287336 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqch2\" (UniqueName: \"kubernetes.io/projected/02fa7d42-ad30-474b-98f6-ad1e423af7cc-kube-api-access-pqch2\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287346 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287355 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d7a88100-3013-41b1-9524-f89f4d4a58ed-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287364 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9xl2\" (UniqueName: 
\"kubernetes.io/projected/d7a88100-3013-41b1-9524-f89f4d4a58ed-kube-api-access-j9xl2\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287374 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287382 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287390 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.287400 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.308185 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-config-data" (OuterVolumeSpecName: "config-data") pod "02fa7d42-ad30-474b-98f6-ad1e423af7cc" (UID: "02fa7d42-ad30-474b-98f6-ad1e423af7cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.320675 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.327367 4764 scope.go:117] "RemoveContainer" containerID="e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.336431 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6677596fcf-6rh2n" podStartSLOduration=7.336412584 podStartE2EDuration="7.336412584s" podCreationTimestamp="2025-12-04 00:01:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:01:58.314740402 +0000 UTC m=+1254.076064813" watchObservedRunningTime="2025-12-04 00:01:58.336412584 +0000 UTC m=+1254.097736995" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.337132 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.166008303 podStartE2EDuration="13.337126281s" podCreationTimestamp="2025-12-04 00:01:45 +0000 UTC" firstStartedPulling="2025-12-04 00:01:46.842093273 +0000 UTC m=+1242.603417684" lastFinishedPulling="2025-12-04 00:01:57.013211251 +0000 UTC m=+1252.774535662" observedRunningTime="2025-12-04 00:01:58.295355497 +0000 UTC m=+1254.056679908" watchObservedRunningTime="2025-12-04 00:01:58.337126281 +0000 UTC m=+1254.098450692" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.339561 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" (UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.355941 4764 scope.go:117] "RemoveContainer" containerID="a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.356452 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e\": container with ID starting with a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e not found: ID does not exist" containerID="a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.356501 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e"} err="failed to get container status \"a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e\": rpc error: code = NotFound desc = could not find container \"a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e\": container with ID starting with a6fc2ce94d0523c4875c65cfacd8a08a98b2af68d98c5a1f7ca259f58b8dfd6e not found: ID does not exist" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.356521 4764 scope.go:117] "RemoveContainer" containerID="e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.356831 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e\": container with ID starting with e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e not found: ID does not exist" containerID="e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e" Dec 04 00:01:58 crc 
kubenswrapper[4764]: I1204 00:01:58.356849 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e"} err="failed to get container status \"e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e\": rpc error: code = NotFound desc = could not find container \"e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e\": container with ID starting with e29f306030360d22097e217f1094a039a727ee6c2a2af43141a9054f66b3449e not found: ID does not exist" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.356882 4764 scope.go:117] "RemoveContainer" containerID="dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.389131 4764 scope.go:117] "RemoveContainer" containerID="198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.389519 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02fa7d42-ad30-474b-98f6-ad1e423af7cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.389552 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.389565 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.397288 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-config-data" (OuterVolumeSpecName: "config-data") pod "d7a88100-3013-41b1-9524-f89f4d4a58ed" 
(UID: "d7a88100-3013-41b1-9524-f89f4d4a58ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.420549 4764 scope.go:117] "RemoveContainer" containerID="276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.481434 4764 scope.go:117] "RemoveContainer" containerID="90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.491951 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7a88100-3013-41b1-9524-f89f4d4a58ed-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.560897 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23043d28-d496-4964-814d-864826992e99" path="/var/lib/kubelet/pods/23043d28-d496-4964-814d-864826992e99/volumes" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.591004 4764 scope.go:117] "RemoveContainer" containerID="dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.592529 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649\": container with ID starting with dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649 not found: ID does not exist" containerID="dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.592561 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649"} err="failed to get container status \"dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649\": rpc error: code 
= NotFound desc = could not find container \"dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649\": container with ID starting with dbf659b0e183f71151661b35c2bfa612964cee7abeb10b426a5d9d55caee7649 not found: ID does not exist" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.592585 4764 scope.go:117] "RemoveContainer" containerID="198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.598091 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e\": container with ID starting with 198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e not found: ID does not exist" containerID="198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.598146 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e"} err="failed to get container status \"198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e\": rpc error: code = NotFound desc = could not find container \"198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e\": container with ID starting with 198a0718b22df0814a0807e7ade3a1245919339bfec3742f034bcbeecfed404e not found: ID does not exist" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.598175 4764 scope.go:117] "RemoveContainer" containerID="276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.600528 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c\": container with ID starting with 
276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c not found: ID does not exist" containerID="276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.600584 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c"} err="failed to get container status \"276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c\": rpc error: code = NotFound desc = could not find container \"276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c\": container with ID starting with 276790a85c30079b8a4a4ebbefe9cd23e780e19047e36656fc624885e0a3218c not found: ID does not exist" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.600611 4764 scope.go:117] "RemoveContainer" containerID="90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.601885 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5\": container with ID starting with 90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5 not found: ID does not exist" containerID="90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.601943 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5"} err="failed to get container status \"90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5\": rpc error: code = NotFound desc = could not find container \"90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5\": container with ID starting with 90878d7a800ec1127c951a926ed8f827415ce99bb767ddd809eacb72c784a9b5 not found: ID does not 
exist" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.615877 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.650828 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.684408 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.684960 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="proxy-httpd" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.684971 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="proxy-httpd" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.684989 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-log" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.684995 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-log" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.685013 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-httpd" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685020 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-httpd" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.685031 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="sg-core" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685036 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="sg-core" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.685048 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-central-agent" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685053 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-central-agent" Dec 04 00:01:58 crc kubenswrapper[4764]: E1204 00:01:58.685072 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-notification-agent" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685078 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-notification-agent" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685231 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-notification-agent" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685250 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="ceilometer-central-agent" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685257 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-httpd" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685267 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" containerName="glance-log" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685273 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="proxy-httpd" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.685284 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" containerName="sg-core" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.686128 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.689255 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.689511 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.721073 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.746147 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.768028 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.798420 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.801146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.801274 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802643 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802684 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802734 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802786 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.802841 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/7ef3ecde-294a-410a-ba90-d08a00674b9f-kube-api-access-p62fv\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.807341 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.807498 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.895431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.904616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-log-httpd\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.904696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.904748 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rkv\" (UniqueName: \"kubernetes.io/projected/d5629c18-5c10-48be-9321-a7cc5e581935-kube-api-access-z5rkv\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.904827 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/7ef3ecde-294a-410a-ba90-d08a00674b9f-kube-api-access-p62fv\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.904910 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.904985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-config-data\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-logs\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905313 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905468 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-run-httpd\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-scripts\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905644 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905669 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:58 crc 
kubenswrapper[4764]: I1204 00:01:58.905701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.905707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: W1204 00:01:58.908667 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6de1323_46ca_460b_8a8f_620125ce1d7f.slice/crio-3e3a3f739ffdc57403735e8733cadc11c36b2eaaf8f4279757a54059c4482da6 WatchSource:0}: Error finding container 3e3a3f739ffdc57403735e8733cadc11c36b2eaaf8f4279757a54059c4482da6: Status 404 returned error can't find the container with id 3e3a3f739ffdc57403735e8733cadc11c36b2eaaf8f4279757a54059c4482da6 Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.911689 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.912212 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.912562 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.913458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.922514 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/7ef3ecde-294a-410a-ba90-d08a00674b9f-kube-api-access-p62fv\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:58 crc kubenswrapper[4764]: I1204 00:01:58.934881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " pod="openstack/glance-default-internal-api-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.017841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-run-httpd\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 
00:01:59.017967 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-scripts\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.018012 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.018050 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-log-httpd\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.018095 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rkv\" (UniqueName: \"kubernetes.io/projected/d5629c18-5c10-48be-9321-a7cc5e581935-kube-api-access-z5rkv\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.018167 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.018234 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-config-data\") pod 
\"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.021669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-config-data\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.021975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-log-httpd\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.022773 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-run-httpd\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.026810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.038122 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-scripts\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.040356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.043019 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rkv\" (UniqueName: \"kubernetes.io/projected/d5629c18-5c10-48be-9321-a7cc5e581935-kube-api-access-z5rkv\") pod \"ceilometer-0\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.047153 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.124352 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.304408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6de1323-46ca-460b-8a8f-620125ce1d7f","Type":"ContainerStarted","Data":"3e3a3f739ffdc57403735e8733cadc11c36b2eaaf8f4279757a54059c4482da6"} Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.719362 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:01:59 crc kubenswrapper[4764]: W1204 00:01:59.719514 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef3ecde_294a_410a_ba90_d08a00674b9f.slice/crio-ffa902d357fac0cc212f91fd781954681391e6e81b94c3625926118e96a64abb WatchSource:0}: Error finding container ffa902d357fac0cc212f91fd781954681391e6e81b94c3625926118e96a64abb: Status 404 returned error can't find the container with id ffa902d357fac0cc212f91fd781954681391e6e81b94c3625926118e96a64abb Dec 04 00:01:59 crc kubenswrapper[4764]: I1204 00:01:59.804841 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:01:59 crc kubenswrapper[4764]: W1204 00:01:59.816359 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5629c18_5c10_48be_9321_a7cc5e581935.slice/crio-face92f86eed5704bafd7b1c252522e063e44ecb4a9c14bdf7242ccc21d2f188 WatchSource:0}: Error finding container face92f86eed5704bafd7b1c252522e063e44ecb4a9c14bdf7242ccc21d2f188: Status 404 returned error can't find the container with id face92f86eed5704bafd7b1c252522e063e44ecb4a9c14bdf7242ccc21d2f188 Dec 04 00:02:00 crc kubenswrapper[4764]: I1204 00:02:00.320017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerStarted","Data":"face92f86eed5704bafd7b1c252522e063e44ecb4a9c14bdf7242ccc21d2f188"} Dec 04 00:02:00 crc kubenswrapper[4764]: I1204 00:02:00.321296 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef3ecde-294a-410a-ba90-d08a00674b9f","Type":"ContainerStarted","Data":"ffa902d357fac0cc212f91fd781954681391e6e81b94c3625926118e96a64abb"} Dec 04 00:02:00 crc kubenswrapper[4764]: I1204 00:02:00.322360 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6de1323-46ca-460b-8a8f-620125ce1d7f","Type":"ContainerStarted","Data":"4d9413685b93b99db1f424005b4c8e8740303642c0922ed2d65d664c8191da2b"} Dec 04 00:02:00 crc kubenswrapper[4764]: I1204 00:02:00.557695 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02fa7d42-ad30-474b-98f6-ad1e423af7cc" path="/var/lib/kubelet/pods/02fa7d42-ad30-474b-98f6-ad1e423af7cc/volumes" Dec 04 00:02:00 crc kubenswrapper[4764]: I1204 00:02:00.558435 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a88100-3013-41b1-9524-f89f4d4a58ed" 
path="/var/lib/kubelet/pods/d7a88100-3013-41b1-9524-f89f4d4a58ed/volumes" Dec 04 00:02:01 crc kubenswrapper[4764]: I1204 00:02:01.332684 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6de1323-46ca-460b-8a8f-620125ce1d7f","Type":"ContainerStarted","Data":"9063bab53ed33482cd89acc542fa22e753f09e7ceb951b53766f7dc55ea3fcda"} Dec 04 00:02:01 crc kubenswrapper[4764]: I1204 00:02:01.335178 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerStarted","Data":"832a3f63b288851f0f39ca5969bec51f3da1b190c67dd7509a50ece13698d307"} Dec 04 00:02:01 crc kubenswrapper[4764]: I1204 00:02:01.336971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef3ecde-294a-410a-ba90-d08a00674b9f","Type":"ContainerStarted","Data":"94f6b973870b4eea1c81fd05faef27612f6926a743b37cea298cec8baf77cde6"} Dec 04 00:02:01 crc kubenswrapper[4764]: I1204 00:02:01.337019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef3ecde-294a-410a-ba90-d08a00674b9f","Type":"ContainerStarted","Data":"2398fc991b15ae90f82ce48cda9c9af81acdb7bfc25b1505a825f6d849c9810b"} Dec 04 00:02:01 crc kubenswrapper[4764]: I1204 00:02:01.397952 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.397930851 podStartE2EDuration="4.397930851s" podCreationTimestamp="2025-12-04 00:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:01.368632292 +0000 UTC m=+1257.129956703" watchObservedRunningTime="2025-12-04 00:02:01.397930851 +0000 UTC m=+1257.159255262" Dec 04 00:02:01 crc kubenswrapper[4764]: I1204 00:02:01.998378 4764 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.998358767 podStartE2EDuration="3.998358767s" podCreationTimestamp="2025-12-04 00:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:01.404974774 +0000 UTC m=+1257.166299185" watchObservedRunningTime="2025-12-04 00:02:01.998358767 +0000 UTC m=+1257.759683178" Dec 04 00:02:02 crc kubenswrapper[4764]: I1204 00:02:02.006851 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:02 crc kubenswrapper[4764]: I1204 00:02:02.052671 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:02:02 crc kubenswrapper[4764]: I1204 00:02:02.062532 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:02:02 crc kubenswrapper[4764]: I1204 00:02:02.363610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerStarted","Data":"85641260207dbefbcac874908167fd07d8cfeb6a940dc58f5b8ec083fc859674"} Dec 04 00:02:03 crc kubenswrapper[4764]: I1204 00:02:03.377469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerStarted","Data":"df97c5c023edc3f845d2990204fedcebfaf0a624b684e1c0aec3cd81bf8476f0"} Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.389429 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerStarted","Data":"2a93c486c57189f55f3f1d336829763ddab47dc89ff03482fb6705757bee8d8b"} Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.390083 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ceilometer-0" Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.389964 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="proxy-httpd" containerID="cri-o://2a93c486c57189f55f3f1d336829763ddab47dc89ff03482fb6705757bee8d8b" gracePeriod=30 Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.389590 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-central-agent" containerID="cri-o://832a3f63b288851f0f39ca5969bec51f3da1b190c67dd7509a50ece13698d307" gracePeriod=30 Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.390024 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="sg-core" containerID="cri-o://df97c5c023edc3f845d2990204fedcebfaf0a624b684e1c0aec3cd81bf8476f0" gracePeriod=30 Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.390015 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-notification-agent" containerID="cri-o://85641260207dbefbcac874908167fd07d8cfeb6a940dc58f5b8ec083fc859674" gracePeriod=30 Dec 04 00:02:04 crc kubenswrapper[4764]: I1204 00:02:04.416968 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.992676911 podStartE2EDuration="6.416952015s" podCreationTimestamp="2025-12-04 00:01:58 +0000 UTC" firstStartedPulling="2025-12-04 00:01:59.87948771 +0000 UTC m=+1255.640812121" lastFinishedPulling="2025-12-04 00:02:03.303762824 +0000 UTC m=+1259.065087225" observedRunningTime="2025-12-04 00:02:04.409851471 +0000 UTC m=+1260.171175892" watchObservedRunningTime="2025-12-04 
00:02:04.416952015 +0000 UTC m=+1260.178276426" Dec 04 00:02:05 crc kubenswrapper[4764]: I1204 00:02:05.412700 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5629c18-5c10-48be-9321-a7cc5e581935" containerID="2a93c486c57189f55f3f1d336829763ddab47dc89ff03482fb6705757bee8d8b" exitCode=0 Dec 04 00:02:05 crc kubenswrapper[4764]: I1204 00:02:05.413163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerDied","Data":"2a93c486c57189f55f3f1d336829763ddab47dc89ff03482fb6705757bee8d8b"} Dec 04 00:02:05 crc kubenswrapper[4764]: I1204 00:02:05.413214 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerDied","Data":"df97c5c023edc3f845d2990204fedcebfaf0a624b684e1c0aec3cd81bf8476f0"} Dec 04 00:02:05 crc kubenswrapper[4764]: I1204 00:02:05.413181 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5629c18-5c10-48be-9321-a7cc5e581935" containerID="df97c5c023edc3f845d2990204fedcebfaf0a624b684e1c0aec3cd81bf8476f0" exitCode=2 Dec 04 00:02:05 crc kubenswrapper[4764]: I1204 00:02:05.413246 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5629c18-5c10-48be-9321-a7cc5e581935" containerID="85641260207dbefbcac874908167fd07d8cfeb6a940dc58f5b8ec083fc859674" exitCode=0 Dec 04 00:02:05 crc kubenswrapper[4764]: I1204 00:02:05.413269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerDied","Data":"85641260207dbefbcac874908167fd07d8cfeb6a940dc58f5b8ec083fc859674"} Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.423014 4764 generic.go:334] "Generic (PLEG): container finished" podID="d5629c18-5c10-48be-9321-a7cc5e581935" containerID="832a3f63b288851f0f39ca5969bec51f3da1b190c67dd7509a50ece13698d307" exitCode=0 Dec 04 00:02:06 crc 
kubenswrapper[4764]: I1204 00:02:06.423324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerDied","Data":"832a3f63b288851f0f39ca5969bec51f3da1b190c67dd7509a50ece13698d307"} Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.561802 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651525 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rkv\" (UniqueName: \"kubernetes.io/projected/d5629c18-5c10-48be-9321-a7cc5e581935-kube-api-access-z5rkv\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651664 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-log-httpd\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651706 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-run-httpd\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651748 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-sg-core-conf-yaml\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-config-data\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-combined-ca-bundle\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.651879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-scripts\") pod \"d5629c18-5c10-48be-9321-a7cc5e581935\" (UID: \"d5629c18-5c10-48be-9321-a7cc5e581935\") " Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.652265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.652875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.653895 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.653930 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d5629c18-5c10-48be-9321-a7cc5e581935-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.656843 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5629c18-5c10-48be-9321-a7cc5e581935-kube-api-access-z5rkv" (OuterVolumeSpecName: "kube-api-access-z5rkv") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "kube-api-access-z5rkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.666048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-scripts" (OuterVolumeSpecName: "scripts") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.678908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.732818 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.755923 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.755961 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.755977 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.755988 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5rkv\" (UniqueName: \"kubernetes.io/projected/d5629c18-5c10-48be-9321-a7cc5e581935-kube-api-access-z5rkv\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.768893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-config-data" (OuterVolumeSpecName: "config-data") pod "d5629c18-5c10-48be-9321-a7cc5e581935" (UID: "d5629c18-5c10-48be-9321-a7cc5e581935"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:06 crc kubenswrapper[4764]: I1204 00:02:06.858046 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5629c18-5c10-48be-9321-a7cc5e581935-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.434809 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d5629c18-5c10-48be-9321-a7cc5e581935","Type":"ContainerDied","Data":"face92f86eed5704bafd7b1c252522e063e44ecb4a9c14bdf7242ccc21d2f188"} Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.435106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.435133 4764 scope.go:117] "RemoveContainer" containerID="2a93c486c57189f55f3f1d336829763ddab47dc89ff03482fb6705757bee8d8b" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.464977 4764 scope.go:117] "RemoveContainer" containerID="df97c5c023edc3f845d2990204fedcebfaf0a624b684e1c0aec3cd81bf8476f0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.473709 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.483618 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.496357 4764 scope.go:117] "RemoveContainer" containerID="85641260207dbefbcac874908167fd07d8cfeb6a940dc58f5b8ec083fc859674" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.500616 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:07 crc kubenswrapper[4764]: E1204 00:02:07.501067 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-notification-agent" Dec 04 00:02:07 crc 
kubenswrapper[4764]: I1204 00:02:07.501086 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-notification-agent" Dec 04 00:02:07 crc kubenswrapper[4764]: E1204 00:02:07.501110 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="proxy-httpd" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.501118 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="proxy-httpd" Dec 04 00:02:07 crc kubenswrapper[4764]: E1204 00:02:07.501139 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-central-agent" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.501147 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-central-agent" Dec 04 00:02:07 crc kubenswrapper[4764]: E1204 00:02:07.501166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="sg-core" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.501173 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="sg-core" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.501375 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-notification-agent" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.501394 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="proxy-httpd" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.501411 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="sg-core" Dec 04 00:02:07 
crc kubenswrapper[4764]: I1204 00:02:07.501424 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" containerName="ceilometer-central-agent" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.503817 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.515150 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.538167 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.547632 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.561144 4764 scope.go:117] "RemoveContainer" containerID="832a3f63b288851f0f39ca5969bec51f3da1b190c67dd7509a50ece13698d307" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.674820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj264\" (UniqueName: \"kubernetes.io/projected/40c60149-2fa7-4853-9a5e-46e48f4b5064-kube-api-access-xj264\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.675018 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.675062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-config-data\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.675271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.675402 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-run-httpd\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.675436 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-scripts\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.675466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-log-httpd\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " 
pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776615 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-run-httpd\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776646 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-scripts\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-log-httpd\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776779 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj264\" (UniqueName: \"kubernetes.io/projected/40c60149-2fa7-4853-9a5e-46e48f4b5064-kube-api-access-xj264\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776823 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.776846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-config-data\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.777279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-run-httpd\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.777295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-log-httpd\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.781580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.785602 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-config-data\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.790308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.796628 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-scripts\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.804584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj264\" (UniqueName: \"kubernetes.io/projected/40c60149-2fa7-4853-9a5e-46e48f4b5064-kube-api-access-xj264\") pod \"ceilometer-0\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " pod="openstack/ceilometer-0" Dec 04 00:02:07 crc kubenswrapper[4764]: I1204 00:02:07.862418 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.126909 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.127150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.165510 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.176857 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.381649 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:08 crc kubenswrapper[4764]: W1204 00:02:08.395829 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40c60149_2fa7_4853_9a5e_46e48f4b5064.slice/crio-0c9f0d99cb5b8f014eec0f58b6ef414b7ffeefc4d1a1d6ac99a2105990a69ffe WatchSource:0}: Error finding 
container 0c9f0d99cb5b8f014eec0f58b6ef414b7ffeefc4d1a1d6ac99a2105990a69ffe: Status 404 returned error can't find the container with id 0c9f0d99cb5b8f014eec0f58b6ef414b7ffeefc4d1a1d6ac99a2105990a69ffe Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.444394 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerStarted","Data":"0c9f0d99cb5b8f014eec0f58b6ef414b7ffeefc4d1a1d6ac99a2105990a69ffe"} Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.444982 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.445199 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 00:02:08 crc kubenswrapper[4764]: I1204 00:02:08.557856 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5629c18-5c10-48be-9321-a7cc5e581935" path="/var/lib/kubelet/pods/d5629c18-5c10-48be-9321-a7cc5e581935/volumes" Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.048022 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.048350 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.096979 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.114171 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.454950 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerStarted","Data":"4316fd953cf556b87293313b1e17d584252eeef5613acf8fb938195e2ce326bc"} Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.455297 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:09 crc kubenswrapper[4764]: I1204 00:02:09.455315 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:10 crc kubenswrapper[4764]: I1204 00:02:10.289399 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 00:02:10 crc kubenswrapper[4764]: I1204 00:02:10.447419 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 00:02:10 crc kubenswrapper[4764]: I1204 00:02:10.469066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerStarted","Data":"c0cd99d360040f0bc28dac135f73a0e8eef60b96b1e783e231b98480534a1f33"} Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.513139 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cxmtq"] Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.514599 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.525639 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cxmtq"] Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.546906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8da482-3171-426d-b3ed-41db82605e2a-operator-scripts\") pod \"nova-api-db-create-cxmtq\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.546995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn5m6\" (UniqueName: \"kubernetes.io/projected/6d8da482-3171-426d-b3ed-41db82605e2a-kube-api-access-qn5m6\") pod \"nova-api-db-create-cxmtq\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.609475 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-6pvq9"] Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.610747 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.624649 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6pvq9"] Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.648767 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn5m6\" (UniqueName: \"kubernetes.io/projected/6d8da482-3171-426d-b3ed-41db82605e2a-kube-api-access-qn5m6\") pod \"nova-api-db-create-cxmtq\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.648856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f618e733-cb75-4d62-ac56-525007f16fb7-operator-scripts\") pod \"nova-cell0-db-create-6pvq9\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.648935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwq8s\" (UniqueName: \"kubernetes.io/projected/f618e733-cb75-4d62-ac56-525007f16fb7-kube-api-access-jwq8s\") pod \"nova-cell0-db-create-6pvq9\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.648964 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8da482-3171-426d-b3ed-41db82605e2a-operator-scripts\") pod \"nova-api-db-create-cxmtq\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.649679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/6d8da482-3171-426d-b3ed-41db82605e2a-operator-scripts\") pod \"nova-api-db-create-cxmtq\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.672171 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn5m6\" (UniqueName: \"kubernetes.io/projected/6d8da482-3171-426d-b3ed-41db82605e2a-kube-api-access-qn5m6\") pod \"nova-api-db-create-cxmtq\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.712386 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-m4ntf"] Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.715117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.730683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m4ntf"] Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.750699 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a10db5-2c78-495e-b81c-d0c89e9425ac-operator-scripts\") pod \"nova-cell1-db-create-m4ntf\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.750769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b48v6\" (UniqueName: \"kubernetes.io/projected/f9a10db5-2c78-495e-b81c-d0c89e9425ac-kube-api-access-b48v6\") pod \"nova-cell1-db-create-m4ntf\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.750824 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f618e733-cb75-4d62-ac56-525007f16fb7-operator-scripts\") pod \"nova-cell0-db-create-6pvq9\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.750905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwq8s\" (UniqueName: \"kubernetes.io/projected/f618e733-cb75-4d62-ac56-525007f16fb7-kube-api-access-jwq8s\") pod \"nova-cell0-db-create-6pvq9\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.751816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f618e733-cb75-4d62-ac56-525007f16fb7-operator-scripts\") pod \"nova-cell0-db-create-6pvq9\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.757219 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.757306 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.770884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwq8s\" (UniqueName: \"kubernetes.io/projected/f618e733-cb75-4d62-ac56-525007f16fb7-kube-api-access-jwq8s\") pod \"nova-cell0-db-create-6pvq9\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.830309 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.855781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a10db5-2c78-495e-b81c-d0c89e9425ac-operator-scripts\") pod \"nova-cell1-db-create-m4ntf\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.855841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b48v6\" (UniqueName: \"kubernetes.io/projected/f9a10db5-2c78-495e-b81c-d0c89e9425ac-kube-api-access-b48v6\") pod \"nova-cell1-db-create-m4ntf\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.857391 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a10db5-2c78-495e-b81c-d0c89e9425ac-operator-scripts\") pod \"nova-cell1-db-create-m4ntf\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.873537 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b48v6\" (UniqueName: \"kubernetes.io/projected/f9a10db5-2c78-495e-b81c-d0c89e9425ac-kube-api-access-b48v6\") pod \"nova-cell1-db-create-m4ntf\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.897158 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.921030 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fdfe-account-create-update-s6znl"] Dec 04 00:02:11 
crc kubenswrapper[4764]: I1204 00:02:11.922143 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.925173 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.926227 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.957625 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvnh\" (UniqueName: \"kubernetes.io/projected/d82499c8-3378-44b1-83f4-db79e6bd190b-kube-api-access-6cvnh\") pod \"nova-api-fdfe-account-create-update-s6znl\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.957692 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d82499c8-3378-44b1-83f4-db79e6bd190b-operator-scripts\") pod \"nova-api-fdfe-account-create-update-s6znl\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:11 crc kubenswrapper[4764]: I1204 00:02:11.973140 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fdfe-account-create-update-s6znl"] Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.045339 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.060390 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvnh\" (UniqueName: \"kubernetes.io/projected/d82499c8-3378-44b1-83f4-db79e6bd190b-kube-api-access-6cvnh\") pod \"nova-api-fdfe-account-create-update-s6znl\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.060455 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d82499c8-3378-44b1-83f4-db79e6bd190b-operator-scripts\") pod \"nova-api-fdfe-account-create-update-s6znl\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.061180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d82499c8-3378-44b1-83f4-db79e6bd190b-operator-scripts\") pod \"nova-api-fdfe-account-create-update-s6znl\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.083684 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvnh\" (UniqueName: \"kubernetes.io/projected/d82499c8-3378-44b1-83f4-db79e6bd190b-kube-api-access-6cvnh\") pod \"nova-api-fdfe-account-create-update-s6znl\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.132090 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4790-account-create-update-dbw9n"] Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.134586 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.138986 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.159901 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4790-account-create-update-dbw9n"] Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.161586 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd4ab5b-3e62-4708-8322-424df55d8cf4-operator-scripts\") pod \"nova-cell0-4790-account-create-update-dbw9n\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.161642 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wwc\" (UniqueName: \"kubernetes.io/projected/dcd4ab5b-3e62-4708-8322-424df55d8cf4-kube-api-access-d6wwc\") pod \"nova-cell0-4790-account-create-update-dbw9n\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.247858 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.263514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd4ab5b-3e62-4708-8322-424df55d8cf4-operator-scripts\") pod \"nova-cell0-4790-account-create-update-dbw9n\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.263613 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6wwc\" (UniqueName: \"kubernetes.io/projected/dcd4ab5b-3e62-4708-8322-424df55d8cf4-kube-api-access-d6wwc\") pod \"nova-cell0-4790-account-create-update-dbw9n\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.265782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd4ab5b-3e62-4708-8322-424df55d8cf4-operator-scripts\") pod \"nova-cell0-4790-account-create-update-dbw9n\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.291623 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wwc\" (UniqueName: \"kubernetes.io/projected/dcd4ab5b-3e62-4708-8322-424df55d8cf4-kube-api-access-d6wwc\") pod \"nova-cell0-4790-account-create-update-dbw9n\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.342931 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2e9e-account-create-update-6nxlv"] Dec 04 00:02:12 crc kubenswrapper[4764]: 
I1204 00:02:12.344460 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.349225 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.364082 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2e9e-account-create-update-6nxlv"] Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.364889 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxqk8\" (UniqueName: \"kubernetes.io/projected/2012c377-0bc0-43e3-a919-6b6f753d9dde-kube-api-access-lxqk8\") pod \"nova-cell1-2e9e-account-create-update-6nxlv\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.365013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2012c377-0bc0-43e3-a919-6b6f753d9dde-operator-scripts\") pod \"nova-cell1-2e9e-account-create-update-6nxlv\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.462547 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.466524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2012c377-0bc0-43e3-a919-6b6f753d9dde-operator-scripts\") pod \"nova-cell1-2e9e-account-create-update-6nxlv\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.466653 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxqk8\" (UniqueName: \"kubernetes.io/projected/2012c377-0bc0-43e3-a919-6b6f753d9dde-kube-api-access-lxqk8\") pod \"nova-cell1-2e9e-account-create-update-6nxlv\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.467313 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2012c377-0bc0-43e3-a919-6b6f753d9dde-operator-scripts\") pod \"nova-cell1-2e9e-account-create-update-6nxlv\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.487172 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxqk8\" (UniqueName: \"kubernetes.io/projected/2012c377-0bc0-43e3-a919-6b6f753d9dde-kube-api-access-lxqk8\") pod \"nova-cell1-2e9e-account-create-update-6nxlv\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.588926 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-6pvq9"] Dec 04 00:02:12 crc kubenswrapper[4764]: W1204 
00:02:12.603230 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf618e733_cb75_4d62_ac56_525007f16fb7.slice/crio-0f3f06fe90788671f8c420b54c0df36368d4588c2899da759b0828c089893903 WatchSource:0}: Error finding container 0f3f06fe90788671f8c420b54c0df36368d4588c2899da759b0828c089893903: Status 404 returned error can't find the container with id 0f3f06fe90788671f8c420b54c0df36368d4588c2899da759b0828c089893903 Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.660236 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.660777 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cxmtq"] Dec 04 00:02:12 crc kubenswrapper[4764]: W1204 00:02:12.681863 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8da482_3171_426d_b3ed_41db82605e2a.slice/crio-5ad13e63aa42fba21dec878c3a833b46e6aab79fb12306fffd949689cb8a4ae0 WatchSource:0}: Error finding container 5ad13e63aa42fba21dec878c3a833b46e6aab79fb12306fffd949689cb8a4ae0: Status 404 returned error can't find the container with id 5ad13e63aa42fba21dec878c3a833b46e6aab79fb12306fffd949689cb8a4ae0 Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.712045 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-m4ntf"] Dec 04 00:02:12 crc kubenswrapper[4764]: I1204 00:02:12.978159 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fdfe-account-create-update-s6znl"] Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.176781 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4790-account-create-update-dbw9n"] Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.434948 4764 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2e9e-account-create-update-6nxlv"] Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.515378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxmtq" event={"ID":"6d8da482-3171-426d-b3ed-41db82605e2a","Type":"ContainerStarted","Data":"29a9588dd349d2237fc97952dcf0d1d4dd46c4740977eb7c054e0a2cb436f498"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.515419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxmtq" event={"ID":"6d8da482-3171-426d-b3ed-41db82605e2a","Type":"ContainerStarted","Data":"5ad13e63aa42fba21dec878c3a833b46e6aab79fb12306fffd949689cb8a4ae0"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.519056 4764 generic.go:334] "Generic (PLEG): container finished" podID="f618e733-cb75-4d62-ac56-525007f16fb7" containerID="5dd7869d059a76707a2c2390bd0370902c7d32aa5bdbbcfb6ebc6bde71de3eb3" exitCode=0 Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.519173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6pvq9" event={"ID":"f618e733-cb75-4d62-ac56-525007f16fb7","Type":"ContainerDied","Data":"5dd7869d059a76707a2c2390bd0370902c7d32aa5bdbbcfb6ebc6bde71de3eb3"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.519193 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6pvq9" event={"ID":"f618e733-cb75-4d62-ac56-525007f16fb7","Type":"ContainerStarted","Data":"0f3f06fe90788671f8c420b54c0df36368d4588c2899da759b0828c089893903"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.525852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerStarted","Data":"ff1a426bc45ad35f0283b0b95f32a43c3728cc5d8fa9b80443f419eff026f927"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.533220 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-db-create-m4ntf" event={"ID":"f9a10db5-2c78-495e-b81c-d0c89e9425ac","Type":"ContainerStarted","Data":"6b48922dbe5bd6a9e5a211a4a63d2a70064129d41782d8dbd285376d4d5294eb"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.533278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m4ntf" event={"ID":"f9a10db5-2c78-495e-b81c-d0c89e9425ac","Type":"ContainerStarted","Data":"3a5cbb19ea2a53b2bb233f39969d13582c0109a954d617c9e4481654f13aac7c"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.536983 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-cxmtq" podStartSLOduration=2.536966364 podStartE2EDuration="2.536966364s" podCreationTimestamp="2025-12-04 00:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:13.526687002 +0000 UTC m=+1269.288011413" watchObservedRunningTime="2025-12-04 00:02:13.536966364 +0000 UTC m=+1269.298290765" Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.539995 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" event={"ID":"dcd4ab5b-3e62-4708-8322-424df55d8cf4","Type":"ContainerStarted","Data":"70f214ed8bf826d8b28709dc9706d2d24a3f20c75076ecb161b40b621f1f4eaa"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.540030 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" event={"ID":"dcd4ab5b-3e62-4708-8322-424df55d8cf4","Type":"ContainerStarted","Data":"abd584bf9509d3034546a58ab5ece04c62d932f94ae9dba91901485dc6223ddd"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.541649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" 
event={"ID":"2012c377-0bc0-43e3-a919-6b6f753d9dde","Type":"ContainerStarted","Data":"3fd4e14a57dcc59d1d63529677199fa49e1e28823c6383b51dc2c0446ccc85ee"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.542993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdfe-account-create-update-s6znl" event={"ID":"d82499c8-3378-44b1-83f4-db79e6bd190b","Type":"ContainerStarted","Data":"e37bdb0112bb5af89123cbef2d5134d36eba12df511aac7c26c14ffd85dc3bc2"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.543018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdfe-account-create-update-s6znl" event={"ID":"d82499c8-3378-44b1-83f4-db79e6bd190b","Type":"ContainerStarted","Data":"5263cf91359f74756ed105bf74229171e4a8f0daccf32940df5517bb92501e42"} Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.577241 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" podStartSLOduration=1.577215531 podStartE2EDuration="1.577215531s" podCreationTimestamp="2025-12-04 00:02:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:13.568364714 +0000 UTC m=+1269.329689125" watchObservedRunningTime="2025-12-04 00:02:13.577215531 +0000 UTC m=+1269.338539942" Dec 04 00:02:13 crc kubenswrapper[4764]: I1204 00:02:13.595149 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-fdfe-account-create-update-s6znl" podStartSLOduration=2.595126251 podStartE2EDuration="2.595126251s" podCreationTimestamp="2025-12-04 00:02:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:13.586308964 +0000 UTC m=+1269.347633375" watchObservedRunningTime="2025-12-04 00:02:13.595126251 +0000 UTC m=+1269.356450662" Dec 04 00:02:13 crc 
kubenswrapper[4764]: E1204 00:02:13.950142 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd4ab5b_3e62_4708_8322_424df55d8cf4.slice/crio-conmon-70f214ed8bf826d8b28709dc9706d2d24a3f20c75076ecb161b40b621f1f4eaa.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddcd4ab5b_3e62_4708_8322_424df55d8cf4.slice/crio-70f214ed8bf826d8b28709dc9706d2d24a3f20c75076ecb161b40b621f1f4eaa.scope\": RecentStats: unable to find data in memory cache]" Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.557841 4764 generic.go:334] "Generic (PLEG): container finished" podID="dcd4ab5b-3e62-4708-8322-424df55d8cf4" containerID="70f214ed8bf826d8b28709dc9706d2d24a3f20c75076ecb161b40b621f1f4eaa" exitCode=0 Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.558095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" event={"ID":"dcd4ab5b-3e62-4708-8322-424df55d8cf4","Type":"ContainerDied","Data":"70f214ed8bf826d8b28709dc9706d2d24a3f20c75076ecb161b40b621f1f4eaa"} Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.559564 4764 generic.go:334] "Generic (PLEG): container finished" podID="2012c377-0bc0-43e3-a919-6b6f753d9dde" containerID="e09650ac3d53e1b4d98c00a093044328b22aa9b28e30aaf0b37a44249f841cb1" exitCode=0 Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.559605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" event={"ID":"2012c377-0bc0-43e3-a919-6b6f753d9dde","Type":"ContainerDied","Data":"e09650ac3d53e1b4d98c00a093044328b22aa9b28e30aaf0b37a44249f841cb1"} Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.560785 4764 generic.go:334] "Generic (PLEG): container finished" podID="d82499c8-3378-44b1-83f4-db79e6bd190b" 
containerID="e37bdb0112bb5af89123cbef2d5134d36eba12df511aac7c26c14ffd85dc3bc2" exitCode=0 Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.560824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdfe-account-create-update-s6znl" event={"ID":"d82499c8-3378-44b1-83f4-db79e6bd190b","Type":"ContainerDied","Data":"e37bdb0112bb5af89123cbef2d5134d36eba12df511aac7c26c14ffd85dc3bc2"} Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.562143 4764 generic.go:334] "Generic (PLEG): container finished" podID="6d8da482-3171-426d-b3ed-41db82605e2a" containerID="29a9588dd349d2237fc97952dcf0d1d4dd46c4740977eb7c054e0a2cb436f498" exitCode=0 Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.562181 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxmtq" event={"ID":"6d8da482-3171-426d-b3ed-41db82605e2a","Type":"ContainerDied","Data":"29a9588dd349d2237fc97952dcf0d1d4dd46c4740977eb7c054e0a2cb436f498"} Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.564008 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerStarted","Data":"34de28885856e59fcb6118983eb2652bbfd3e5a10570ff0d2e995a866bc3f5e5"} Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.564856 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.565875 4764 generic.go:334] "Generic (PLEG): container finished" podID="f9a10db5-2c78-495e-b81c-d0c89e9425ac" containerID="6b48922dbe5bd6a9e5a211a4a63d2a70064129d41782d8dbd285376d4d5294eb" exitCode=0 Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.566038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m4ntf" 
event={"ID":"f9a10db5-2c78-495e-b81c-d0c89e9425ac","Type":"ContainerDied","Data":"6b48922dbe5bd6a9e5a211a4a63d2a70064129d41782d8dbd285376d4d5294eb"} Dec 04 00:02:14 crc kubenswrapper[4764]: I1204 00:02:14.623770 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.131917595 podStartE2EDuration="7.623755969s" podCreationTimestamp="2025-12-04 00:02:07 +0000 UTC" firstStartedPulling="2025-12-04 00:02:08.398785124 +0000 UTC m=+1264.160109545" lastFinishedPulling="2025-12-04 00:02:13.890623508 +0000 UTC m=+1269.651947919" observedRunningTime="2025-12-04 00:02:14.622407026 +0000 UTC m=+1270.383731437" watchObservedRunningTime="2025-12-04 00:02:14.623755969 +0000 UTC m=+1270.385080380" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.079091 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.092505 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.238587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a10db5-2c78-495e-b81c-d0c89e9425ac-operator-scripts\") pod \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.238752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwq8s\" (UniqueName: \"kubernetes.io/projected/f618e733-cb75-4d62-ac56-525007f16fb7-kube-api-access-jwq8s\") pod \"f618e733-cb75-4d62-ac56-525007f16fb7\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.238847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f618e733-cb75-4d62-ac56-525007f16fb7-operator-scripts\") pod \"f618e733-cb75-4d62-ac56-525007f16fb7\" (UID: \"f618e733-cb75-4d62-ac56-525007f16fb7\") " Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.238868 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b48v6\" (UniqueName: \"kubernetes.io/projected/f9a10db5-2c78-495e-b81c-d0c89e9425ac-kube-api-access-b48v6\") pod \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\" (UID: \"f9a10db5-2c78-495e-b81c-d0c89e9425ac\") " Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.240847 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f618e733-cb75-4d62-ac56-525007f16fb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f618e733-cb75-4d62-ac56-525007f16fb7" (UID: "f618e733-cb75-4d62-ac56-525007f16fb7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.241187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a10db5-2c78-495e-b81c-d0c89e9425ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9a10db5-2c78-495e-b81c-d0c89e9425ac" (UID: "f9a10db5-2c78-495e-b81c-d0c89e9425ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.244265 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a10db5-2c78-495e-b81c-d0c89e9425ac-kube-api-access-b48v6" (OuterVolumeSpecName: "kube-api-access-b48v6") pod "f9a10db5-2c78-495e-b81c-d0c89e9425ac" (UID: "f9a10db5-2c78-495e-b81c-d0c89e9425ac"). InnerVolumeSpecName "kube-api-access-b48v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.254097 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f618e733-cb75-4d62-ac56-525007f16fb7-kube-api-access-jwq8s" (OuterVolumeSpecName: "kube-api-access-jwq8s") pod "f618e733-cb75-4d62-ac56-525007f16fb7" (UID: "f618e733-cb75-4d62-ac56-525007f16fb7"). InnerVolumeSpecName "kube-api-access-jwq8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.340410 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwq8s\" (UniqueName: \"kubernetes.io/projected/f618e733-cb75-4d62-ac56-525007f16fb7-kube-api-access-jwq8s\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.341540 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f618e733-cb75-4d62-ac56-525007f16fb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.341578 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b48v6\" (UniqueName: \"kubernetes.io/projected/f9a10db5-2c78-495e-b81c-d0c89e9425ac-kube-api-access-b48v6\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.341589 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a10db5-2c78-495e-b81c-d0c89e9425ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.583892 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-6pvq9" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.583909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-6pvq9" event={"ID":"f618e733-cb75-4d62-ac56-525007f16fb7","Type":"ContainerDied","Data":"0f3f06fe90788671f8c420b54c0df36368d4588c2899da759b0828c089893903"} Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.584314 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f3f06fe90788671f8c420b54c0df36368d4588c2899da759b0828c089893903" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.593456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-m4ntf" event={"ID":"f9a10db5-2c78-495e-b81c-d0c89e9425ac","Type":"ContainerDied","Data":"3a5cbb19ea2a53b2bb233f39969d13582c0109a954d617c9e4481654f13aac7c"} Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.593494 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a5cbb19ea2a53b2bb233f39969d13582c0109a954d617c9e4481654f13aac7c" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.593707 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-m4ntf" Dec 04 00:02:15 crc kubenswrapper[4764]: I1204 00:02:15.922309 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.055583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d82499c8-3378-44b1-83f4-db79e6bd190b-operator-scripts\") pod \"d82499c8-3378-44b1-83f4-db79e6bd190b\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.055637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cvnh\" (UniqueName: \"kubernetes.io/projected/d82499c8-3378-44b1-83f4-db79e6bd190b-kube-api-access-6cvnh\") pod \"d82499c8-3378-44b1-83f4-db79e6bd190b\" (UID: \"d82499c8-3378-44b1-83f4-db79e6bd190b\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.060398 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d82499c8-3378-44b1-83f4-db79e6bd190b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d82499c8-3378-44b1-83f4-db79e6bd190b" (UID: "d82499c8-3378-44b1-83f4-db79e6bd190b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.076946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82499c8-3378-44b1-83f4-db79e6bd190b-kube-api-access-6cvnh" (OuterVolumeSpecName: "kube-api-access-6cvnh") pod "d82499c8-3378-44b1-83f4-db79e6bd190b" (UID: "d82499c8-3378-44b1-83f4-db79e6bd190b"). InnerVolumeSpecName "kube-api-access-6cvnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.156413 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.157467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd4ab5b-3e62-4708-8322-424df55d8cf4-operator-scripts\") pod \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.157925 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6wwc\" (UniqueName: \"kubernetes.io/projected/dcd4ab5b-3e62-4708-8322-424df55d8cf4-kube-api-access-d6wwc\") pod \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\" (UID: \"dcd4ab5b-3e62-4708-8322-424df55d8cf4\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.158166 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd4ab5b-3e62-4708-8322-424df55d8cf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcd4ab5b-3e62-4708-8322-424df55d8cf4" (UID: "dcd4ab5b-3e62-4708-8322-424df55d8cf4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.158756 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcd4ab5b-3e62-4708-8322-424df55d8cf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.158824 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d82499c8-3378-44b1-83f4-db79e6bd190b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.158877 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cvnh\" (UniqueName: \"kubernetes.io/projected/d82499c8-3378-44b1-83f4-db79e6bd190b-kube-api-access-6cvnh\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.163894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd4ab5b-3e62-4708-8322-424df55d8cf4-kube-api-access-d6wwc" (OuterVolumeSpecName: "kube-api-access-d6wwc") pod "dcd4ab5b-3e62-4708-8322-424df55d8cf4" (UID: "dcd4ab5b-3e62-4708-8322-424df55d8cf4"). InnerVolumeSpecName "kube-api-access-d6wwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.260229 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6wwc\" (UniqueName: \"kubernetes.io/projected/dcd4ab5b-3e62-4708-8322-424df55d8cf4-kube-api-access-d6wwc\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.272730 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.281937 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.360678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2012c377-0bc0-43e3-a919-6b6f753d9dde-operator-scripts\") pod \"2012c377-0bc0-43e3-a919-6b6f753d9dde\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.360759 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8da482-3171-426d-b3ed-41db82605e2a-operator-scripts\") pod \"6d8da482-3171-426d-b3ed-41db82605e2a\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.360787 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxqk8\" (UniqueName: \"kubernetes.io/projected/2012c377-0bc0-43e3-a919-6b6f753d9dde-kube-api-access-lxqk8\") pod \"2012c377-0bc0-43e3-a919-6b6f753d9dde\" (UID: \"2012c377-0bc0-43e3-a919-6b6f753d9dde\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.360824 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn5m6\" (UniqueName: \"kubernetes.io/projected/6d8da482-3171-426d-b3ed-41db82605e2a-kube-api-access-qn5m6\") pod \"6d8da482-3171-426d-b3ed-41db82605e2a\" (UID: \"6d8da482-3171-426d-b3ed-41db82605e2a\") " Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.364174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8da482-3171-426d-b3ed-41db82605e2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d8da482-3171-426d-b3ed-41db82605e2a" (UID: "6d8da482-3171-426d-b3ed-41db82605e2a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.364320 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2012c377-0bc0-43e3-a919-6b6f753d9dde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2012c377-0bc0-43e3-a919-6b6f753d9dde" (UID: "2012c377-0bc0-43e3-a919-6b6f753d9dde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.367249 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2012c377-0bc0-43e3-a919-6b6f753d9dde-kube-api-access-lxqk8" (OuterVolumeSpecName: "kube-api-access-lxqk8") pod "2012c377-0bc0-43e3-a919-6b6f753d9dde" (UID: "2012c377-0bc0-43e3-a919-6b6f753d9dde"). InnerVolumeSpecName "kube-api-access-lxqk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.367915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8da482-3171-426d-b3ed-41db82605e2a-kube-api-access-qn5m6" (OuterVolumeSpecName: "kube-api-access-qn5m6") pod "6d8da482-3171-426d-b3ed-41db82605e2a" (UID: "6d8da482-3171-426d-b3ed-41db82605e2a"). InnerVolumeSpecName "kube-api-access-qn5m6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.462235 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2012c377-0bc0-43e3-a919-6b6f753d9dde-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.462268 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d8da482-3171-426d-b3ed-41db82605e2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.462278 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxqk8\" (UniqueName: \"kubernetes.io/projected/2012c377-0bc0-43e3-a919-6b6f753d9dde-kube-api-access-lxqk8\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.462291 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn5m6\" (UniqueName: \"kubernetes.io/projected/6d8da482-3171-426d-b3ed-41db82605e2a-kube-api-access-qn5m6\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.605772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" event={"ID":"dcd4ab5b-3e62-4708-8322-424df55d8cf4","Type":"ContainerDied","Data":"abd584bf9509d3034546a58ab5ece04c62d932f94ae9dba91901485dc6223ddd"} Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.605811 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd584bf9509d3034546a58ab5ece04c62d932f94ae9dba91901485dc6223ddd" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.605869 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4790-account-create-update-dbw9n" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.607990 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" event={"ID":"2012c377-0bc0-43e3-a919-6b6f753d9dde","Type":"ContainerDied","Data":"3fd4e14a57dcc59d1d63529677199fa49e1e28823c6383b51dc2c0446ccc85ee"} Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.608025 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fd4e14a57dcc59d1d63529677199fa49e1e28823c6383b51dc2c0446ccc85ee" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.608079 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2e9e-account-create-update-6nxlv" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.612331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdfe-account-create-update-s6znl" event={"ID":"d82499c8-3378-44b1-83f4-db79e6bd190b","Type":"ContainerDied","Data":"5263cf91359f74756ed105bf74229171e4a8f0daccf32940df5517bb92501e42"} Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.612379 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5263cf91359f74756ed105bf74229171e4a8f0daccf32940df5517bb92501e42" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.612433 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fdfe-account-create-update-s6znl" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.615685 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cxmtq" Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.615883 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxmtq" event={"ID":"6d8da482-3171-426d-b3ed-41db82605e2a","Type":"ContainerDied","Data":"5ad13e63aa42fba21dec878c3a833b46e6aab79fb12306fffd949689cb8a4ae0"} Dec 04 00:02:16 crc kubenswrapper[4764]: I1204 00:02:16.615913 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ad13e63aa42fba21dec878c3a833b46e6aab79fb12306fffd949689cb8a4ae0" Dec 04 00:02:17 crc kubenswrapper[4764]: I1204 00:02:17.116836 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:17 crc kubenswrapper[4764]: I1204 00:02:17.622479 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-central-agent" containerID="cri-o://4316fd953cf556b87293313b1e17d584252eeef5613acf8fb938195e2ce326bc" gracePeriod=30 Dec 04 00:02:17 crc kubenswrapper[4764]: I1204 00:02:17.622515 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="sg-core" containerID="cri-o://ff1a426bc45ad35f0283b0b95f32a43c3728cc5d8fa9b80443f419eff026f927" gracePeriod=30 Dec 04 00:02:17 crc kubenswrapper[4764]: I1204 00:02:17.622578 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-notification-agent" containerID="cri-o://c0cd99d360040f0bc28dac135f73a0e8eef60b96b1e783e231b98480534a1f33" gracePeriod=30 Dec 04 00:02:17 crc kubenswrapper[4764]: I1204 00:02:17.622547 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="proxy-httpd" containerID="cri-o://34de28885856e59fcb6118983eb2652bbfd3e5a10570ff0d2e995a866bc3f5e5" gracePeriod=30 Dec 04 00:02:18 crc kubenswrapper[4764]: I1204 00:02:18.634405 4764 generic.go:334] "Generic (PLEG): container finished" podID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerID="34de28885856e59fcb6118983eb2652bbfd3e5a10570ff0d2e995a866bc3f5e5" exitCode=0 Dec 04 00:02:18 crc kubenswrapper[4764]: I1204 00:02:18.634734 4764 generic.go:334] "Generic (PLEG): container finished" podID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerID="ff1a426bc45ad35f0283b0b95f32a43c3728cc5d8fa9b80443f419eff026f927" exitCode=2 Dec 04 00:02:18 crc kubenswrapper[4764]: I1204 00:02:18.634748 4764 generic.go:334] "Generic (PLEG): container finished" podID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerID="c0cd99d360040f0bc28dac135f73a0e8eef60b96b1e783e231b98480534a1f33" exitCode=0 Dec 04 00:02:18 crc kubenswrapper[4764]: I1204 00:02:18.634442 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerDied","Data":"34de28885856e59fcb6118983eb2652bbfd3e5a10570ff0d2e995a866bc3f5e5"} Dec 04 00:02:18 crc kubenswrapper[4764]: I1204 00:02:18.634781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerDied","Data":"ff1a426bc45ad35f0283b0b95f32a43c3728cc5d8fa9b80443f419eff026f927"} Dec 04 00:02:18 crc kubenswrapper[4764]: I1204 00:02:18.634861 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerDied","Data":"c0cd99d360040f0bc28dac135f73a0e8eef60b96b1e783e231b98480534a1f33"} Dec 04 00:02:20 crc kubenswrapper[4764]: I1204 00:02:20.868618 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:02:20 crc kubenswrapper[4764]: I1204 00:02:20.868940 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:02:20 crc kubenswrapper[4764]: I1204 00:02:20.868984 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:02:20 crc kubenswrapper[4764]: I1204 00:02:20.869655 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2a1a3ac2c269173a49ebd8f63b614762be69151c2f69effa92f89083eb82227"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:02:20 crc kubenswrapper[4764]: I1204 00:02:20.869703 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://a2a1a3ac2c269173a49ebd8f63b614762be69151c2f69effa92f89083eb82227" gracePeriod=600 Dec 04 00:02:21 crc kubenswrapper[4764]: I1204 00:02:21.659434 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="a2a1a3ac2c269173a49ebd8f63b614762be69151c2f69effa92f89083eb82227" exitCode=0 Dec 04 00:02:21 crc kubenswrapper[4764]: I1204 00:02:21.659497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"a2a1a3ac2c269173a49ebd8f63b614762be69151c2f69effa92f89083eb82227"} Dec 04 00:02:21 crc kubenswrapper[4764]: I1204 00:02:21.659831 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"b526b3cd6175d5387ff9cf45d5f19a22adfa1e890770ac42ded7e5b8a5bf721a"} Dec 04 00:02:21 crc kubenswrapper[4764]: I1204 00:02:21.659880 4764 scope.go:117] "RemoveContainer" containerID="6a4a22d80a831b04f5a3234f6450f79d1fb6db8ec2fe0aa77fcaeb8ebd9ef8e9" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.367089 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrx6k"] Dec 04 00:02:22 crc kubenswrapper[4764]: E1204 00:02:22.370826 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f618e733-cb75-4d62-ac56-525007f16fb7" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.370857 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f618e733-cb75-4d62-ac56-525007f16fb7" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: E1204 00:02:22.370881 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd4ab5b-3e62-4708-8322-424df55d8cf4" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.370887 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd4ab5b-3e62-4708-8322-424df55d8cf4" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: E1204 00:02:22.370916 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a10db5-2c78-495e-b81c-d0c89e9425ac" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 
00:02:22.370922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a10db5-2c78-495e-b81c-d0c89e9425ac" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: E1204 00:02:22.370933 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8da482-3171-426d-b3ed-41db82605e2a" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.370939 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8da482-3171-426d-b3ed-41db82605e2a" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: E1204 00:02:22.370966 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2012c377-0bc0-43e3-a919-6b6f753d9dde" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.370972 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2012c377-0bc0-43e3-a919-6b6f753d9dde" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: E1204 00:02:22.370984 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82499c8-3378-44b1-83f4-db79e6bd190b" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.370989 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82499c8-3378-44b1-83f4-db79e6bd190b" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.371334 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a10db5-2c78-495e-b81c-d0c89e9425ac" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.371345 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f618e733-cb75-4d62-ac56-525007f16fb7" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.371361 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6d8da482-3171-426d-b3ed-41db82605e2a" containerName="mariadb-database-create" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.371372 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2012c377-0bc0-43e3-a919-6b6f753d9dde" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.371387 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd4ab5b-3e62-4708-8322-424df55d8cf4" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.371397 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82499c8-3378-44b1-83f4-db79e6bd190b" containerName="mariadb-account-create-update" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.372108 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.384810 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.385363 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rk5k4" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.386032 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.393161 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrx6k"] Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.459776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-scripts\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " 
pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.460004 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.460058 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwrf\" (UniqueName: \"kubernetes.io/projected/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-kube-api-access-fnwrf\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.460116 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-config-data\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.562869 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-scripts\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.562926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-combined-ca-bundle\") pod 
\"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.562985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwrf\" (UniqueName: \"kubernetes.io/projected/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-kube-api-access-fnwrf\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.563042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-config-data\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.569294 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-scripts\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.575615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.590295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-config-data\") pod 
\"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.593341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwrf\" (UniqueName: \"kubernetes.io/projected/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-kube-api-access-fnwrf\") pod \"nova-cell0-conductor-db-sync-hrx6k\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.669242 4764 generic.go:334] "Generic (PLEG): container finished" podID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerID="4316fd953cf556b87293313b1e17d584252eeef5613acf8fb938195e2ce326bc" exitCode=0 Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.669305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerDied","Data":"4316fd953cf556b87293313b1e17d584252eeef5613acf8fb938195e2ce326bc"} Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.719692 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.757481 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.867641 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-config-data\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.867822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-log-httpd\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.867879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-sg-core-conf-yaml\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.867983 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj264\" (UniqueName: \"kubernetes.io/projected/40c60149-2fa7-4853-9a5e-46e48f4b5064-kube-api-access-xj264\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.868045 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-scripts\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.868075 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-run-httpd\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.868135 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-combined-ca-bundle\") pod \"40c60149-2fa7-4853-9a5e-46e48f4b5064\" (UID: \"40c60149-2fa7-4853-9a5e-46e48f4b5064\") " Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.871966 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.872061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.875801 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-scripts" (OuterVolumeSpecName: "scripts") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.875963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c60149-2fa7-4853-9a5e-46e48f4b5064-kube-api-access-xj264" (OuterVolumeSpecName: "kube-api-access-xj264") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "kube-api-access-xj264". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.904155 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.972233 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.972270 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xj264\" (UniqueName: \"kubernetes.io/projected/40c60149-2fa7-4853-9a5e-46e48f4b5064-kube-api-access-xj264\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.972282 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.972291 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-run-httpd\") on node 
\"crc\" DevicePath \"\"" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.972301 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40c60149-2fa7-4853-9a5e-46e48f4b5064-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.982939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:22 crc kubenswrapper[4764]: I1204 00:02:22.998622 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-config-data" (OuterVolumeSpecName: "config-data") pod "40c60149-2fa7-4853-9a5e-46e48f4b5064" (UID: "40c60149-2fa7-4853-9a5e-46e48f4b5064"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.073630 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.073988 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40c60149-2fa7-4853-9a5e-46e48f4b5064-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.217039 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrx6k"] Dec 04 00:02:23 crc kubenswrapper[4764]: W1204 00:02:23.222321 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e8e96e1_9fbf_416e_a8ab_91b0f8f98946.slice/crio-870f06502919e5897b82da51d75579ec705ed6dca6264c17852841c43df07065 WatchSource:0}: Error finding container 870f06502919e5897b82da51d75579ec705ed6dca6264c17852841c43df07065: Status 404 returned error can't find the container with id 870f06502919e5897b82da51d75579ec705ed6dca6264c17852841c43df07065 Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.682133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" event={"ID":"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946","Type":"ContainerStarted","Data":"870f06502919e5897b82da51d75579ec705ed6dca6264c17852841c43df07065"} Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.686534 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"40c60149-2fa7-4853-9a5e-46e48f4b5064","Type":"ContainerDied","Data":"0c9f0d99cb5b8f014eec0f58b6ef414b7ffeefc4d1a1d6ac99a2105990a69ffe"} Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.686589 4764 scope.go:117] 
"RemoveContainer" containerID="34de28885856e59fcb6118983eb2652bbfd3e5a10570ff0d2e995a866bc3f5e5" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.686855 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.713257 4764 scope.go:117] "RemoveContainer" containerID="ff1a426bc45ad35f0283b0b95f32a43c3728cc5d8fa9b80443f419eff026f927" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.740772 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.741070 4764 scope.go:117] "RemoveContainer" containerID="c0cd99d360040f0bc28dac135f73a0e8eef60b96b1e783e231b98480534a1f33" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.759335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.773658 4764 scope.go:117] "RemoveContainer" containerID="4316fd953cf556b87293313b1e17d584252eeef5613acf8fb938195e2ce326bc" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775186 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:23 crc kubenswrapper[4764]: E1204 00:02:23.775554 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="proxy-httpd" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775571 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="proxy-httpd" Dec 04 00:02:23 crc kubenswrapper[4764]: E1204 00:02:23.775589 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="sg-core" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775597 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" 
containerName="sg-core" Dec 04 00:02:23 crc kubenswrapper[4764]: E1204 00:02:23.775608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-central-agent" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775613 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-central-agent" Dec 04 00:02:23 crc kubenswrapper[4764]: E1204 00:02:23.775623 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-notification-agent" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775628 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-notification-agent" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775873 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="sg-core" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775897 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="proxy-httpd" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775907 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-notification-agent" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.775921 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" containerName="ceilometer-central-agent" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.777588 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.780569 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.780764 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.801036 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886665 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-config-data\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886737 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6sfg\" (UniqueName: \"kubernetes.io/projected/ac012962-724d-4075-b561-1b6a53a6d9f5-kube-api-access-j6sfg\") pod \"ceilometer-0\" (UID: 
\"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-scripts\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-log-httpd\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.886958 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-run-httpd\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988245 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6sfg\" (UniqueName: \"kubernetes.io/projected/ac012962-724d-4075-b561-1b6a53a6d9f5-kube-api-access-j6sfg\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988320 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-scripts\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-log-httpd\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-run-httpd\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988469 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-config-data\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.988502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.998526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-scripts\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc 
kubenswrapper[4764]: I1204 00:02:23.999495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-log-httpd\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:23 crc kubenswrapper[4764]: I1204 00:02:23.999741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-run-httpd\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.001216 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.014769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-config-data\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.015456 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.027375 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6sfg\" (UniqueName: \"kubernetes.io/projected/ac012962-724d-4075-b561-1b6a53a6d9f5-kube-api-access-j6sfg\") pod \"ceilometer-0\" (UID: 
\"ac012962-724d-4075-b561-1b6a53a6d9f5\") " pod="openstack/ceilometer-0" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.101041 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.556507 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c60149-2fa7-4853-9a5e-46e48f4b5064" path="/var/lib/kubelet/pods/40c60149-2fa7-4853-9a5e-46e48f4b5064/volumes" Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.557744 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:02:24 crc kubenswrapper[4764]: W1204 00:02:24.566253 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac012962_724d_4075_b561_1b6a53a6d9f5.slice/crio-e21339d5d17f776663e1043f5ed920063a8f3e72a5e7e9273fcae4ceabf92fa3 WatchSource:0}: Error finding container e21339d5d17f776663e1043f5ed920063a8f3e72a5e7e9273fcae4ceabf92fa3: Status 404 returned error can't find the container with id e21339d5d17f776663e1043f5ed920063a8f3e72a5e7e9273fcae4ceabf92fa3 Dec 04 00:02:24 crc kubenswrapper[4764]: I1204 00:02:24.695327 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerStarted","Data":"e21339d5d17f776663e1043f5ed920063a8f3e72a5e7e9273fcae4ceabf92fa3"} Dec 04 00:02:25 crc kubenswrapper[4764]: I1204 00:02:25.708994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerStarted","Data":"f33db2316aaf8ebec702e3b5eba9bd15ed4374529daed1a43ad878fcd96daee0"} Dec 04 00:02:26 crc kubenswrapper[4764]: I1204 00:02:26.720472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerStarted","Data":"95e17231a256246bc718647605db323899aea854db67811fb3cec8329bbbd766"} Dec 04 00:02:32 crc kubenswrapper[4764]: I1204 00:02:32.793578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" event={"ID":"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946","Type":"ContainerStarted","Data":"fac18db9328e4748c82b0da435e7fd4230232dd05715f2d69d781fb2da6f13d5"} Dec 04 00:02:32 crc kubenswrapper[4764]: I1204 00:02:32.800206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerStarted","Data":"9462a871d3db42e08d77cec747102f2b4ac4c07d2a1b2cfd83ecbf72c52a60a0"} Dec 04 00:02:32 crc kubenswrapper[4764]: I1204 00:02:32.821322 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" podStartSLOduration=1.74918023 podStartE2EDuration="10.821290904s" podCreationTimestamp="2025-12-04 00:02:22 +0000 UTC" firstStartedPulling="2025-12-04 00:02:23.224764588 +0000 UTC m=+1278.986088999" lastFinishedPulling="2025-12-04 00:02:32.296875252 +0000 UTC m=+1288.058199673" observedRunningTime="2025-12-04 00:02:32.816073076 +0000 UTC m=+1288.577397517" watchObservedRunningTime="2025-12-04 00:02:32.821290904 +0000 UTC m=+1288.582615355" Dec 04 00:02:34 crc kubenswrapper[4764]: I1204 00:02:34.827274 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerStarted","Data":"81cdb76ca3b2d742d82a8ad9d4b0512affb4a0653f0db484dfa7546997e003ab"} Dec 04 00:02:34 crc kubenswrapper[4764]: I1204 00:02:34.827944 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 00:02:34 crc kubenswrapper[4764]: I1204 00:02:34.857920 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.3532247 podStartE2EDuration="11.857902063s" podCreationTimestamp="2025-12-04 00:02:23 +0000 UTC" firstStartedPulling="2025-12-04 00:02:24.569877318 +0000 UTC m=+1280.331201729" lastFinishedPulling="2025-12-04 00:02:34.074554681 +0000 UTC m=+1289.835879092" observedRunningTime="2025-12-04 00:02:34.851894296 +0000 UTC m=+1290.613218717" watchObservedRunningTime="2025-12-04 00:02:34.857902063 +0000 UTC m=+1290.619226474" Dec 04 00:02:42 crc kubenswrapper[4764]: I1204 00:02:42.906160 4764 generic.go:334] "Generic (PLEG): container finished" podID="2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" containerID="fac18db9328e4748c82b0da435e7fd4230232dd05715f2d69d781fb2da6f13d5" exitCode=0 Dec 04 00:02:42 crc kubenswrapper[4764]: I1204 00:02:42.906234 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" event={"ID":"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946","Type":"ContainerDied","Data":"fac18db9328e4748c82b0da435e7fd4230232dd05715f2d69d781fb2da6f13d5"} Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.268427 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.384766 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-combined-ca-bundle\") pod \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.384985 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-config-data\") pod \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.385145 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-scripts\") pod \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.385278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnwrf\" (UniqueName: \"kubernetes.io/projected/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-kube-api-access-fnwrf\") pod \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\" (UID: \"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946\") " Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.389875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-scripts" (OuterVolumeSpecName: "scripts") pod "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" (UID: "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.391662 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-kube-api-access-fnwrf" (OuterVolumeSpecName: "kube-api-access-fnwrf") pod "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" (UID: "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946"). InnerVolumeSpecName "kube-api-access-fnwrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.428570 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-config-data" (OuterVolumeSpecName: "config-data") pod "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" (UID: "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.432223 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" (UID: "2e8e96e1-9fbf-416e-a8ab-91b0f8f98946"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.488173 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.488217 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnwrf\" (UniqueName: \"kubernetes.io/projected/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-kube-api-access-fnwrf\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.488234 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.488247 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.930218 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" event={"ID":"2e8e96e1-9fbf-416e-a8ab-91b0f8f98946","Type":"ContainerDied","Data":"870f06502919e5897b82da51d75579ec705ed6dca6264c17852841c43df07065"} Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.930258 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hrx6k" Dec 04 00:02:44 crc kubenswrapper[4764]: I1204 00:02:44.930263 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870f06502919e5897b82da51d75579ec705ed6dca6264c17852841c43df07065" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.037274 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 00:02:45 crc kubenswrapper[4764]: E1204 00:02:45.037736 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" containerName="nova-cell0-conductor-db-sync" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.037754 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" containerName="nova-cell0-conductor-db-sync" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.037991 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" containerName="nova-cell0-conductor-db-sync" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.038830 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.042452 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.042467 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rk5k4" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.054074 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.104470 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.104847 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.104890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql4gt\" (UniqueName: \"kubernetes.io/projected/58eedbd8-7bbd-444f-bd11-784c5e7429fa-kube-api-access-ql4gt\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.206740 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.206814 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.206892 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql4gt\" (UniqueName: \"kubernetes.io/projected/58eedbd8-7bbd-444f-bd11-784c5e7429fa-kube-api-access-ql4gt\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.212552 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.214042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.227495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql4gt\" (UniqueName: \"kubernetes.io/projected/58eedbd8-7bbd-444f-bd11-784c5e7429fa-kube-api-access-ql4gt\") pod \"nova-cell0-conductor-0\" (UID: 
\"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.365419 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.659771 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 00:02:45 crc kubenswrapper[4764]: W1204 00:02:45.660493 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58eedbd8_7bbd_444f_bd11_784c5e7429fa.slice/crio-44a0f17b16daf8d4539230161a8b8e19c7c6a3b91b157f26cfa18b87763aa04e WatchSource:0}: Error finding container 44a0f17b16daf8d4539230161a8b8e19c7c6a3b91b157f26cfa18b87763aa04e: Status 404 returned error can't find the container with id 44a0f17b16daf8d4539230161a8b8e19c7c6a3b91b157f26cfa18b87763aa04e Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.945294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58eedbd8-7bbd-444f-bd11-784c5e7429fa","Type":"ContainerStarted","Data":"9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9"} Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.945334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58eedbd8-7bbd-444f-bd11-784c5e7429fa","Type":"ContainerStarted","Data":"44a0f17b16daf8d4539230161a8b8e19c7c6a3b91b157f26cfa18b87763aa04e"} Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.945622 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:45 crc kubenswrapper[4764]: I1204 00:02:45.975699 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=0.975682739 podStartE2EDuration="975.682739ms" 
podCreationTimestamp="2025-12-04 00:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:45.964845393 +0000 UTC m=+1301.726169814" watchObservedRunningTime="2025-12-04 00:02:45.975682739 +0000 UTC m=+1301.737007150" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.395671 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.870989 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bplfc"] Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.872373 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.885477 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.885800 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bplfc"] Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.886013 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.912476 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8xml\" (UniqueName: \"kubernetes.io/projected/d131b557-f02e-4925-a9eb-52202bce1b00-kube-api-access-g8xml\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.912560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.912610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-scripts\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:50 crc kubenswrapper[4764]: I1204 00:02:50.912648 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-config-data\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.014691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8xml\" (UniqueName: \"kubernetes.io/projected/d131b557-f02e-4925-a9eb-52202bce1b00-kube-api-access-g8xml\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.015018 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.015054 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-scripts\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.015083 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-config-data\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.022629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-config-data\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.031097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-scripts\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.032446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.046393 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8xml\" (UniqueName: \"kubernetes.io/projected/d131b557-f02e-4925-a9eb-52202bce1b00-kube-api-access-g8xml\") pod 
\"nova-cell0-cell-mapping-bplfc\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.057399 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.060154 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.065150 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.093178 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.117077 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-config-data\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.117109 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8195bb2e-e6a6-467c-90f3-a89f1dffe077-logs\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.117130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.117173 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-wwsn9\" (UniqueName: \"kubernetes.io/projected/8195bb2e-e6a6-467c-90f3-a89f1dffe077-kube-api-access-wwsn9\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.176814 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.178324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.188780 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.197745 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.199026 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.201439 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.205678 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.214233 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.218290 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-config-data\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.218329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8195bb2e-e6a6-467c-90f3-a89f1dffe077-logs\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.218348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.218382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwsn9\" (UniqueName: \"kubernetes.io/projected/8195bb2e-e6a6-467c-90f3-a89f1dffe077-kube-api-access-wwsn9\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.219129 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8195bb2e-e6a6-467c-90f3-a89f1dffe077-logs\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.227206 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 
00:02:51.231666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.240444 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-config-data\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.260200 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwsn9\" (UniqueName: \"kubernetes.io/projected/8195bb2e-e6a6-467c-90f3-a89f1dffe077-kube-api-access-wwsn9\") pod \"nova-api-0\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.305904 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-b6krd"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.307505 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-config-data\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6z5\" (UniqueName: \"kubernetes.io/projected/829e611e-4d52-4a09-b7a4-56d64bf2b892-kube-api-access-vf6z5\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321186 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxrm\" (UniqueName: \"kubernetes.io/projected/46da4bdf-7f2a-4a43-991f-2dc63a52019f-kube-api-access-qsxrm\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321239 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46da4bdf-7f2a-4a43-991f-2dc63a52019f-logs\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 
00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-config-data\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.321346 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.356704 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-b6krd"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.372038 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.373402 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.378168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.380807 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.422213 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-config-data\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-svc\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-config-data\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423582 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmwd\" (UniqueName: \"kubernetes.io/projected/e3a1bc90-81db-4765-82e7-d74d47aeb02b-kube-api-access-cjmwd\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423617 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6z5\" (UniqueName: \"kubernetes.io/projected/829e611e-4d52-4a09-b7a4-56d64bf2b892-kube-api-access-vf6z5\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423662 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxrm\" (UniqueName: \"kubernetes.io/projected/46da4bdf-7f2a-4a43-991f-2dc63a52019f-kube-api-access-qsxrm\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423691 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46da4bdf-7f2a-4a43-991f-2dc63a52019f-logs\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423763 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-config\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.423811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.424629 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46da4bdf-7f2a-4a43-991f-2dc63a52019f-logs\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.430321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-config-data\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.433786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-config-data\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.434234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.436591 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.439400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6z5\" (UniqueName: \"kubernetes.io/projected/829e611e-4d52-4a09-b7a4-56d64bf2b892-kube-api-access-vf6z5\") pod \"nova-scheduler-0\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.451584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxrm\" (UniqueName: \"kubernetes.io/projected/46da4bdf-7f2a-4a43-991f-2dc63a52019f-kube-api-access-qsxrm\") pod \"nova-metadata-0\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.510205 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-config\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525402 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-svc\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 
crc kubenswrapper[4764]: I1204 00:02:51.525504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljskp\" (UniqueName: \"kubernetes.io/projected/10778b9d-7c88-4b04-adbc-ad78521192dd-kube-api-access-ljskp\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525532 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmwd\" (UniqueName: \"kubernetes.io/projected/e3a1bc90-81db-4765-82e7-d74d47aeb02b-kube-api-access-cjmwd\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525566 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.525637 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.526477 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc 
kubenswrapper[4764]: I1204 00:02:51.526497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-config\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.526972 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-svc\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.527209 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.527381 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.544265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmwd\" (UniqueName: \"kubernetes.io/projected/e3a1bc90-81db-4765-82e7-d74d47aeb02b-kube-api-access-cjmwd\") pod \"dnsmasq-dns-7bd87576bf-b6krd\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.622770 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.627567 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljskp\" (UniqueName: \"kubernetes.io/projected/10778b9d-7c88-4b04-adbc-ad78521192dd-kube-api-access-ljskp\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.627654 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.627770 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.631416 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.633788 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 
00:02:51.645262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljskp\" (UniqueName: \"kubernetes.io/projected/10778b9d-7c88-4b04-adbc-ad78521192dd-kube-api-access-ljskp\") pod \"nova-cell1-novncproxy-0\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.674803 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.709767 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.857471 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bplfc"] Dec 04 00:02:51 crc kubenswrapper[4764]: W1204 00:02:51.864616 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd131b557_f02e_4925_a9eb_52202bce1b00.slice/crio-ea19e0b88b5df90257fb3786476fe4e4be1b603c2d61b107308367018048ac14 WatchSource:0}: Error finding container ea19e0b88b5df90257fb3786476fe4e4be1b603c2d61b107308367018048ac14: Status 404 returned error can't find the container with id ea19e0b88b5df90257fb3786476fe4e4be1b603c2d61b107308367018048ac14 Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.933258 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h7p67"] Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.934390 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.937190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.937230 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 00:02:51 crc kubenswrapper[4764]: I1204 00:02:51.942674 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h7p67"] Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.000473 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.031788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bplfc" event={"ID":"d131b557-f02e-4925-a9eb-52202bce1b00","Type":"ContainerStarted","Data":"ea19e0b88b5df90257fb3786476fe4e4be1b603c2d61b107308367018048ac14"} Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.036321 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-scripts\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.036830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-config-data\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.036865 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4f4\" (UniqueName: \"kubernetes.io/projected/cbd1176f-8fbf-442a-98ed-293aff954480-kube-api-access-mx4f4\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.036930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.116185 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.150537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.150638 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-scripts\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.151061 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-config-data\") pod \"nova-cell1-conductor-db-sync-h7p67\" 
(UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.151120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4f4\" (UniqueName: \"kubernetes.io/projected/cbd1176f-8fbf-442a-98ed-293aff954480-kube-api-access-mx4f4\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.165471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-config-data\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.165885 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-scripts\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.171460 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.191306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4f4\" (UniqueName: \"kubernetes.io/projected/cbd1176f-8fbf-442a-98ed-293aff954480-kube-api-access-mx4f4\") pod \"nova-cell1-conductor-db-sync-h7p67\" (UID: 
\"cbd1176f-8fbf-442a-98ed-293aff954480\") " pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.236672 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.284821 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.314701 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-b6krd"] Dec 04 00:02:52 crc kubenswrapper[4764]: I1204 00:02:52.325375 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.066201 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h7p67"] Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.076651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"829e611e-4d52-4a09-b7a4-56d64bf2b892","Type":"ContainerStarted","Data":"71a0a2beb5d4572d8df3e17fff62989d8d477987046d72c0641788a2ad32b69f"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.084228 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerID="eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715" exitCode=0 Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.084312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" event={"ID":"e3a1bc90-81db-4765-82e7-d74d47aeb02b","Type":"ContainerDied","Data":"eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.084338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" 
event={"ID":"e3a1bc90-81db-4765-82e7-d74d47aeb02b","Type":"ContainerStarted","Data":"b88a2ae68a4a4f41d414b08614bc6dd13a4a67649e943512b9b0af66bcf216eb"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.096644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46da4bdf-7f2a-4a43-991f-2dc63a52019f","Type":"ContainerStarted","Data":"e96c7e8b24c7802d43cf8f11f36c97259060b27a864d85d5789065d2efe665d2"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.106449 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bplfc" event={"ID":"d131b557-f02e-4925-a9eb-52202bce1b00","Type":"ContainerStarted","Data":"e26f24e6446b7d99d0fcfd4d59fe0434ee5a259e6dcb1fa6dde3cea82250aad3"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.112456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"10778b9d-7c88-4b04-adbc-ad78521192dd","Type":"ContainerStarted","Data":"b6dc7ca814500f89896454c22cecd077989d5fbbccdf024f3d05e80945efdbc9"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.143248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8195bb2e-e6a6-467c-90f3-a89f1dffe077","Type":"ContainerStarted","Data":"8877b3f167d5b3dac35e796966255316f2264c388405a9db45f847845b16f206"} Dec 04 00:02:53 crc kubenswrapper[4764]: I1204 00:02:53.176323 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bplfc" podStartSLOduration=3.176127665 podStartE2EDuration="3.176127665s" podCreationTimestamp="2025-12-04 00:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:53.166998272 +0000 UTC m=+1308.928322683" watchObservedRunningTime="2025-12-04 00:02:53.176127665 +0000 UTC m=+1308.937452076" Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 
00:02:54.153272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" event={"ID":"e3a1bc90-81db-4765-82e7-d74d47aeb02b","Type":"ContainerStarted","Data":"17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca"} Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.154234 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.156310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h7p67" event={"ID":"cbd1176f-8fbf-442a-98ed-293aff954480","Type":"ContainerStarted","Data":"c2572b9c7d658fe7e9003dfd8ad292f6e1b9dfdbf825855b6980c2f1bc6cc2a5"} Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.156451 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h7p67" event={"ID":"cbd1176f-8fbf-442a-98ed-293aff954480","Type":"ContainerStarted","Data":"200e292e64e31e4701deafe1a30b530992989d8065816a0dfd5a088bf8e527d1"} Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.179638 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" podStartSLOduration=3.179620417 podStartE2EDuration="3.179620417s" podCreationTimestamp="2025-12-04 00:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:54.171793715 +0000 UTC m=+1309.933118136" watchObservedRunningTime="2025-12-04 00:02:54.179620417 +0000 UTC m=+1309.940944818" Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.190128 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.195823 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-db-sync-h7p67" podStartSLOduration=3.195709172 podStartE2EDuration="3.195709172s" podCreationTimestamp="2025-12-04 00:02:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:02:54.19030811 +0000 UTC m=+1309.951632521" watchObservedRunningTime="2025-12-04 00:02:54.195709172 +0000 UTC m=+1309.957033593" Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.977942 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:54 crc kubenswrapper[4764]: I1204 00:02:54.986403 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.182538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"829e611e-4d52-4a09-b7a4-56d64bf2b892","Type":"ContainerStarted","Data":"dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb"} Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.184660 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46da4bdf-7f2a-4a43-991f-2dc63a52019f","Type":"ContainerStarted","Data":"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c"} Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.184729 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46da4bdf-7f2a-4a43-991f-2dc63a52019f","Type":"ContainerStarted","Data":"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00"} Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.184797 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-log" containerID="cri-o://80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00" 
gracePeriod=30 Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.184911 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-metadata" containerID="cri-o://4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c" gracePeriod=30 Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.186760 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"10778b9d-7c88-4b04-adbc-ad78521192dd","Type":"ContainerStarted","Data":"ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a"} Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.186875 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="10778b9d-7c88-4b04-adbc-ad78521192dd" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a" gracePeriod=30 Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.197581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8195bb2e-e6a6-467c-90f3-a89f1dffe077","Type":"ContainerStarted","Data":"a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3"} Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.197637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8195bb2e-e6a6-467c-90f3-a89f1dffe077","Type":"ContainerStarted","Data":"6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36"} Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.213041 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.261922809 podStartE2EDuration="6.213006954s" podCreationTimestamp="2025-12-04 00:02:51 +0000 UTC" firstStartedPulling="2025-12-04 00:02:52.232932642 +0000 
UTC m=+1307.994257053" lastFinishedPulling="2025-12-04 00:02:56.184016787 +0000 UTC m=+1311.945341198" observedRunningTime="2025-12-04 00:02:57.209821856 +0000 UTC m=+1312.971146287" watchObservedRunningTime="2025-12-04 00:02:57.213006954 +0000 UTC m=+1312.974331365" Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.237302 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.181853336 podStartE2EDuration="6.23728281s" podCreationTimestamp="2025-12-04 00:02:51 +0000 UTC" firstStartedPulling="2025-12-04 00:02:52.133306479 +0000 UTC m=+1307.894630890" lastFinishedPulling="2025-12-04 00:02:56.188735943 +0000 UTC m=+1311.950060364" observedRunningTime="2025-12-04 00:02:57.236430109 +0000 UTC m=+1312.997754540" watchObservedRunningTime="2025-12-04 00:02:57.23728281 +0000 UTC m=+1312.998607221" Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.270135 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.08783288 podStartE2EDuration="6.270112415s" podCreationTimestamp="2025-12-04 00:02:51 +0000 UTC" firstStartedPulling="2025-12-04 00:02:52.00248039 +0000 UTC m=+1307.763804801" lastFinishedPulling="2025-12-04 00:02:56.184759925 +0000 UTC m=+1311.946084336" observedRunningTime="2025-12-04 00:02:57.259487354 +0000 UTC m=+1313.020811765" watchObservedRunningTime="2025-12-04 00:02:57.270112415 +0000 UTC m=+1313.031436826" Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.285639 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.454596005 podStartE2EDuration="6.285620585s" podCreationTimestamp="2025-12-04 00:02:51 +0000 UTC" firstStartedPulling="2025-12-04 00:02:52.353964831 +0000 UTC m=+1308.115289232" lastFinishedPulling="2025-12-04 00:02:56.184989401 +0000 UTC m=+1311.946313812" observedRunningTime="2025-12-04 00:02:57.277827834 +0000 
UTC m=+1313.039152245" watchObservedRunningTime="2025-12-04 00:02:57.285620585 +0000 UTC m=+1313.046944996" Dec 04 00:02:57 crc kubenswrapper[4764]: I1204 00:02:57.869250 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.029150 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46da4bdf-7f2a-4a43-991f-2dc63a52019f-logs\") pod \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.029200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-config-data\") pod \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.029278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsxrm\" (UniqueName: \"kubernetes.io/projected/46da4bdf-7f2a-4a43-991f-2dc63a52019f-kube-api-access-qsxrm\") pod \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.029399 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-combined-ca-bundle\") pod \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\" (UID: \"46da4bdf-7f2a-4a43-991f-2dc63a52019f\") " Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.029475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46da4bdf-7f2a-4a43-991f-2dc63a52019f-logs" (OuterVolumeSpecName: "logs") pod "46da4bdf-7f2a-4a43-991f-2dc63a52019f" (UID: 
"46da4bdf-7f2a-4a43-991f-2dc63a52019f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.030000 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46da4bdf-7f2a-4a43-991f-2dc63a52019f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.035525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46da4bdf-7f2a-4a43-991f-2dc63a52019f-kube-api-access-qsxrm" (OuterVolumeSpecName: "kube-api-access-qsxrm") pod "46da4bdf-7f2a-4a43-991f-2dc63a52019f" (UID: "46da4bdf-7f2a-4a43-991f-2dc63a52019f"). InnerVolumeSpecName "kube-api-access-qsxrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.068908 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46da4bdf-7f2a-4a43-991f-2dc63a52019f" (UID: "46da4bdf-7f2a-4a43-991f-2dc63a52019f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.085931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-config-data" (OuterVolumeSpecName: "config-data") pod "46da4bdf-7f2a-4a43-991f-2dc63a52019f" (UID: "46da4bdf-7f2a-4a43-991f-2dc63a52019f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.111295 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.111539 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="660b3b11-42db-456f-997b-250a9120afc9" containerName="kube-state-metrics" containerID="cri-o://4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae" gracePeriod=30 Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.131508 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.131785 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46da4bdf-7f2a-4a43-991f-2dc63a52019f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.131888 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsxrm\" (UniqueName: \"kubernetes.io/projected/46da4bdf-7f2a-4a43-991f-2dc63a52019f-kube-api-access-qsxrm\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.207771 4764 generic.go:334] "Generic (PLEG): container finished" podID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerID="4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c" exitCode=0 Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.207804 4764 generic.go:334] "Generic (PLEG): container finished" podID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerID="80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00" exitCode=143 Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.208923 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-metadata-0" event={"ID":"46da4bdf-7f2a-4a43-991f-2dc63a52019f","Type":"ContainerDied","Data":"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c"} Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.208965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46da4bdf-7f2a-4a43-991f-2dc63a52019f","Type":"ContainerDied","Data":"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00"} Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.208978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46da4bdf-7f2a-4a43-991f-2dc63a52019f","Type":"ContainerDied","Data":"e96c7e8b24c7802d43cf8f11f36c97259060b27a864d85d5789065d2efe665d2"} Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.208997 4764 scope.go:117] "RemoveContainer" containerID="4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.209129 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.308512 4764 scope.go:117] "RemoveContainer" containerID="80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.315834 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.333916 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.361010 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:58 crc kubenswrapper[4764]: E1204 00:02:58.361475 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-metadata" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.361492 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-metadata" Dec 04 00:02:58 crc kubenswrapper[4764]: E1204 00:02:58.361520 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-log" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.361527 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-log" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.361695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-metadata" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.361711 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" containerName="nova-metadata-log" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.362653 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.367401 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.367654 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.368887 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.384596 4764 scope.go:117] "RemoveContainer" containerID="4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c" Dec 04 00:02:58 crc kubenswrapper[4764]: E1204 00:02:58.402977 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c\": container with ID starting with 4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c not found: ID does not exist" containerID="4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.403310 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c"} err="failed to get container status \"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c\": rpc error: code = NotFound desc = could not find container \"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c\": container with ID starting with 4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c not found: ID does not exist" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.403339 4764 scope.go:117] "RemoveContainer" containerID="80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00" 
Dec 04 00:02:58 crc kubenswrapper[4764]: E1204 00:02:58.412184 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00\": container with ID starting with 80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00 not found: ID does not exist" containerID="80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.412219 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00"} err="failed to get container status \"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00\": rpc error: code = NotFound desc = could not find container \"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00\": container with ID starting with 80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00 not found: ID does not exist" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.412242 4764 scope.go:117] "RemoveContainer" containerID="4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.416522 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c"} err="failed to get container status \"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c\": rpc error: code = NotFound desc = could not find container \"4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c\": container with ID starting with 4a216bd98b0e0f8f8e7695fe365ba3c5a847957c63c7cb2f85252640dacf427c not found: ID does not exist" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.416560 4764 scope.go:117] "RemoveContainer" 
containerID="80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.416948 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00"} err="failed to get container status \"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00\": rpc error: code = NotFound desc = could not find container \"80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00\": container with ID starting with 80c5e92bbb371bd9c4ef83a0ad5e2061880ec6c2b0abe39a83a82aaf45dc4a00 not found: ID does not exist" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.439780 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.439830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-config-data\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.439939 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.439966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tkfpp\" (UniqueName: \"kubernetes.io/projected/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-kube-api-access-tkfpp\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.440013 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-logs\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.540760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.540813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-config-data\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.540895 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.540916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkfpp\" (UniqueName: \"kubernetes.io/projected/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-kube-api-access-tkfpp\") pod \"nova-metadata-0\" (UID: 
\"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.540953 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-logs\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.541344 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-logs\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.549886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-config-data\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.554259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.555603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.559093 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkfpp\" 
(UniqueName: \"kubernetes.io/projected/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-kube-api-access-tkfpp\") pod \"nova-metadata-0\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.568018 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46da4bdf-7f2a-4a43-991f-2dc63a52019f" path="/var/lib/kubelet/pods/46da4bdf-7f2a-4a43-991f-2dc63a52019f/volumes" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.660443 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.736676 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.743473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h57rf\" (UniqueName: \"kubernetes.io/projected/660b3b11-42db-456f-997b-250a9120afc9-kube-api-access-h57rf\") pod \"660b3b11-42db-456f-997b-250a9120afc9\" (UID: \"660b3b11-42db-456f-997b-250a9120afc9\") " Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.753952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660b3b11-42db-456f-997b-250a9120afc9-kube-api-access-h57rf" (OuterVolumeSpecName: "kube-api-access-h57rf") pod "660b3b11-42db-456f-997b-250a9120afc9" (UID: "660b3b11-42db-456f-997b-250a9120afc9"). InnerVolumeSpecName "kube-api-access-h57rf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:02:58 crc kubenswrapper[4764]: I1204 00:02:58.845708 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h57rf\" (UniqueName: \"kubernetes.io/projected/660b3b11-42db-456f-997b-250a9120afc9-kube-api-access-h57rf\") on node \"crc\" DevicePath \"\"" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.187288 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.221989 4764 generic.go:334] "Generic (PLEG): container finished" podID="660b3b11-42db-456f-997b-250a9120afc9" containerID="4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae" exitCode=2 Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.222050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"660b3b11-42db-456f-997b-250a9120afc9","Type":"ContainerDied","Data":"4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae"} Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.222074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"660b3b11-42db-456f-997b-250a9120afc9","Type":"ContainerDied","Data":"d479e4bb68de2ef7ac39b558e4900b23137e82cb2b705d4b7bcc075dd672d07f"} Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.222076 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.222091 4764 scope.go:117] "RemoveContainer" containerID="4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.223547 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f32b26c-6ee6-4507-8db5-abd49ed04f8c","Type":"ContainerStarted","Data":"1ed2ea876d387f2cb12bcfd86fb5a6b47f4cc0996332f6846411bc2e567a7e1d"} Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.274449 4764 scope.go:117] "RemoveContainer" containerID="4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae" Dec 04 00:02:59 crc kubenswrapper[4764]: E1204 00:02:59.274857 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae\": container with ID starting with 4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae not found: ID does not exist" containerID="4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.274898 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae"} err="failed to get container status \"4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae\": rpc error: code = NotFound desc = could not find container \"4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae\": container with ID starting with 4fb1f793071677a639830320b45dfc33ef12bc0be3eb808ca8a906be75ee4cae not found: ID does not exist" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.275694 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.287447 
4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.299664 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:02:59 crc kubenswrapper[4764]: E1204 00:02:59.300216 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660b3b11-42db-456f-997b-250a9120afc9" containerName="kube-state-metrics" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.300234 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="660b3b11-42db-456f-997b-250a9120afc9" containerName="kube-state-metrics" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.300490 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="660b3b11-42db-456f-997b-250a9120afc9" containerName="kube-state-metrics" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.304567 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.309193 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.309385 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.310279 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.355313 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 
00:02:59.355415 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.355555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5wmg\" (UniqueName: \"kubernetes.io/projected/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-api-access-v5wmg\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.355633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.456994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.457082 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc 
kubenswrapper[4764]: I1204 00:02:59.457127 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5wmg\" (UniqueName: \"kubernetes.io/projected/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-api-access-v5wmg\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.457218 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.461236 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.461615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.462741 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.472589 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v5wmg\" (UniqueName: \"kubernetes.io/projected/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-api-access-v5wmg\") pod \"kube-state-metrics-0\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " pod="openstack/kube-state-metrics-0" Dec 04 00:02:59 crc kubenswrapper[4764]: I1204 00:02:59.667269 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.042903 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.043678 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-central-agent" containerID="cri-o://f33db2316aaf8ebec702e3b5eba9bd15ed4374529daed1a43ad878fcd96daee0" gracePeriod=30 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.043771 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-notification-agent" containerID="cri-o://95e17231a256246bc718647605db323899aea854db67811fb3cec8329bbbd766" gracePeriod=30 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.043774 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="sg-core" containerID="cri-o://9462a871d3db42e08d77cec747102f2b4ac4c07d2a1b2cfd83ecbf72c52a60a0" gracePeriod=30 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.043929 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="proxy-httpd" containerID="cri-o://81cdb76ca3b2d742d82a8ad9d4b0512affb4a0653f0db484dfa7546997e003ab" 
gracePeriod=30 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.100574 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:03:00 crc kubenswrapper[4764]: W1204 00:03:00.104801 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef7e0298_05be_4a37_a1d3_44632ea1d770.slice/crio-28309304e167a7bceccd8028823e8b3a0de9d42a293cae76ebfec32f2a42cd97 WatchSource:0}: Error finding container 28309304e167a7bceccd8028823e8b3a0de9d42a293cae76ebfec32f2a42cd97: Status 404 returned error can't find the container with id 28309304e167a7bceccd8028823e8b3a0de9d42a293cae76ebfec32f2a42cd97 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.234083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef7e0298-05be-4a37-a1d3-44632ea1d770","Type":"ContainerStarted","Data":"28309304e167a7bceccd8028823e8b3a0de9d42a293cae76ebfec32f2a42cd97"} Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.236907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f32b26c-6ee6-4507-8db5-abd49ed04f8c","Type":"ContainerStarted","Data":"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419"} Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.236938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f32b26c-6ee6-4507-8db5-abd49ed04f8c","Type":"ContainerStarted","Data":"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e"} Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.239691 4764 generic.go:334] "Generic (PLEG): container finished" podID="d131b557-f02e-4925-a9eb-52202bce1b00" containerID="e26f24e6446b7d99d0fcfd4d59fe0434ee5a259e6dcb1fa6dde3cea82250aad3" exitCode=0 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.239891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-bplfc" event={"ID":"d131b557-f02e-4925-a9eb-52202bce1b00","Type":"ContainerDied","Data":"e26f24e6446b7d99d0fcfd4d59fe0434ee5a259e6dcb1fa6dde3cea82250aad3"} Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.246970 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerID="81cdb76ca3b2d742d82a8ad9d4b0512affb4a0653f0db484dfa7546997e003ab" exitCode=0 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.247017 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerID="9462a871d3db42e08d77cec747102f2b4ac4c07d2a1b2cfd83ecbf72c52a60a0" exitCode=2 Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.247043 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerDied","Data":"81cdb76ca3b2d742d82a8ad9d4b0512affb4a0653f0db484dfa7546997e003ab"} Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.247073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerDied","Data":"9462a871d3db42e08d77cec747102f2b4ac4c07d2a1b2cfd83ecbf72c52a60a0"} Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.267107 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.267090509 podStartE2EDuration="2.267090509s" podCreationTimestamp="2025-12-04 00:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:00.257926874 +0000 UTC m=+1316.019251285" watchObservedRunningTime="2025-12-04 00:03:00.267090509 +0000 UTC m=+1316.028414920" Dec 04 00:03:00 crc kubenswrapper[4764]: I1204 00:03:00.576710 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="660b3b11-42db-456f-997b-250a9120afc9" path="/var/lib/kubelet/pods/660b3b11-42db-456f-997b-250a9120afc9/volumes" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.267889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef7e0298-05be-4a37-a1d3-44632ea1d770","Type":"ContainerStarted","Data":"f01aa606ef44b4da2e845d6aab65dfe450e2d390badf9f1d24c7c03e50b3beb3"} Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.268218 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.271910 4764 generic.go:334] "Generic (PLEG): container finished" podID="cbd1176f-8fbf-442a-98ed-293aff954480" containerID="c2572b9c7d658fe7e9003dfd8ad292f6e1b9dfdbf825855b6980c2f1bc6cc2a5" exitCode=0 Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.271980 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h7p67" event={"ID":"cbd1176f-8fbf-442a-98ed-293aff954480","Type":"ContainerDied","Data":"c2572b9c7d658fe7e9003dfd8ad292f6e1b9dfdbf825855b6980c2f1bc6cc2a5"} Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.285148 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerID="f33db2316aaf8ebec702e3b5eba9bd15ed4374529daed1a43ad878fcd96daee0" exitCode=0 Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.286828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerDied","Data":"f33db2316aaf8ebec702e3b5eba9bd15ed4374529daed1a43ad878fcd96daee0"} Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.308591 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9216557330000001 podStartE2EDuration="2.308570632s" 
podCreationTimestamp="2025-12-04 00:02:59 +0000 UTC" firstStartedPulling="2025-12-04 00:03:00.107257079 +0000 UTC m=+1315.868581490" lastFinishedPulling="2025-12-04 00:03:00.494171988 +0000 UTC m=+1316.255496389" observedRunningTime="2025-12-04 00:03:01.291270738 +0000 UTC m=+1317.052595219" watchObservedRunningTime="2025-12-04 00:03:01.308570632 +0000 UTC m=+1317.069895033" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.422966 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.424176 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.624890 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.625152 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.655426 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.677014 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.710406 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.712615 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.734174 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-6m772"] Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.734471 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerName="dnsmasq-dns" containerID="cri-o://6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412" gracePeriod=10 Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.898744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-config-data\") pod \"d131b557-f02e-4925-a9eb-52202bce1b00\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.898806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-scripts\") pod \"d131b557-f02e-4925-a9eb-52202bce1b00\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.899074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8xml\" (UniqueName: \"kubernetes.io/projected/d131b557-f02e-4925-a9eb-52202bce1b00-kube-api-access-g8xml\") pod \"d131b557-f02e-4925-a9eb-52202bce1b00\" (UID: \"d131b557-f02e-4925-a9eb-52202bce1b00\") " Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.899176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-combined-ca-bundle\") pod \"d131b557-f02e-4925-a9eb-52202bce1b00\" (UID: 
\"d131b557-f02e-4925-a9eb-52202bce1b00\") " Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.907154 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-scripts" (OuterVolumeSpecName: "scripts") pod "d131b557-f02e-4925-a9eb-52202bce1b00" (UID: "d131b557-f02e-4925-a9eb-52202bce1b00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.934246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d131b557-f02e-4925-a9eb-52202bce1b00-kube-api-access-g8xml" (OuterVolumeSpecName: "kube-api-access-g8xml") pod "d131b557-f02e-4925-a9eb-52202bce1b00" (UID: "d131b557-f02e-4925-a9eb-52202bce1b00"). InnerVolumeSpecName "kube-api-access-g8xml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.939071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-config-data" (OuterVolumeSpecName: "config-data") pod "d131b557-f02e-4925-a9eb-52202bce1b00" (UID: "d131b557-f02e-4925-a9eb-52202bce1b00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:01 crc kubenswrapper[4764]: I1204 00:03:01.940843 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d131b557-f02e-4925-a9eb-52202bce1b00" (UID: "d131b557-f02e-4925-a9eb-52202bce1b00"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.001154 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.001190 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.001202 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8xml\" (UniqueName: \"kubernetes.io/projected/d131b557-f02e-4925-a9eb-52202bce1b00-kube-api-access-g8xml\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.001216 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d131b557-f02e-4925-a9eb-52202bce1b00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.226204 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.324227 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerID="95e17231a256246bc718647605db323899aea854db67811fb3cec8329bbbd766" exitCode=0 Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.324321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerDied","Data":"95e17231a256246bc718647605db323899aea854db67811fb3cec8329bbbd766"} Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.330450 4764 generic.go:334] "Generic (PLEG): container finished" podID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerID="6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412" exitCode=0 Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.330526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" event={"ID":"040441ff-4e5b-4e97-aefa-01ebe3fe0720","Type":"ContainerDied","Data":"6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412"} Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.330560 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" event={"ID":"040441ff-4e5b-4e97-aefa-01ebe3fe0720","Type":"ContainerDied","Data":"81820c376e808d0e5300bd27780a7cc11f992366f7f1bb7ce55efc27c9ae2a04"} Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.330578 4764 scope.go:117] "RemoveContainer" containerID="6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.330709 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56d54d44c7-6m772" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.335499 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bplfc" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.337493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bplfc" event={"ID":"d131b557-f02e-4925-a9eb-52202bce1b00","Type":"ContainerDied","Data":"ea19e0b88b5df90257fb3786476fe4e4be1b603c2d61b107308367018048ac14"} Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.337534 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea19e0b88b5df90257fb3786476fe4e4be1b603c2d61b107308367018048ac14" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.357058 4764 scope.go:117] "RemoveContainer" containerID="9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.369139 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.377290 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.384329 4764 scope.go:117] "RemoveContainer" containerID="6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412" Dec 04 00:03:02 crc kubenswrapper[4764]: E1204 00:03:02.384865 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412\": container with ID starting with 6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412 not found: ID does not exist" containerID="6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.384894 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412"} err="failed to get container status \"6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412\": rpc error: code = NotFound desc = could not find container \"6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412\": container with ID starting with 6a7437f1026a99343f7b5ec6b4b959706d1806eef897b7473a78053486520412 not found: ID does not exist" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.384919 4764 scope.go:117] "RemoveContainer" containerID="9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b" Dec 04 00:03:02 crc kubenswrapper[4764]: E1204 00:03:02.385116 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b\": container with ID starting with 9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b not found: ID does not exist" containerID="9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.385153 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b"} err="failed to get container status \"9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b\": rpc error: code = NotFound desc = could not find container \"9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b\": container with ID starting with 9009f6a77b04a1dc70bdba87328a00ede2d68edf637af2daa90469587621078b not found: ID does not exist" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.416315 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-svc\") pod \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.416472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-sb\") pod \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.416497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-config\") pod \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.416517 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvkt4\" (UniqueName: \"kubernetes.io/projected/040441ff-4e5b-4e97-aefa-01ebe3fe0720-kube-api-access-gvkt4\") pod \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.416661 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-nb\") pod \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.416786 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-swift-storage-0\") pod \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\" (UID: \"040441ff-4e5b-4e97-aefa-01ebe3fe0720\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.424349 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.424812 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.425940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/040441ff-4e5b-4e97-aefa-01ebe3fe0720-kube-api-access-gvkt4" (OuterVolumeSpecName: "kube-api-access-gvkt4") pod "040441ff-4e5b-4e97-aefa-01ebe3fe0720" (UID: "040441ff-4e5b-4e97-aefa-01ebe3fe0720"). InnerVolumeSpecName "kube-api-access-gvkt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.473697 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "040441ff-4e5b-4e97-aefa-01ebe3fe0720" (UID: "040441ff-4e5b-4e97-aefa-01ebe3fe0720"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.484660 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "040441ff-4e5b-4e97-aefa-01ebe3fe0720" (UID: "040441ff-4e5b-4e97-aefa-01ebe3fe0720"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.485156 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-config" (OuterVolumeSpecName: "config") pod "040441ff-4e5b-4e97-aefa-01ebe3fe0720" (UID: "040441ff-4e5b-4e97-aefa-01ebe3fe0720"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.500315 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "040441ff-4e5b-4e97-aefa-01ebe3fe0720" (UID: "040441ff-4e5b-4e97-aefa-01ebe3fe0720"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6sfg\" (UniqueName: \"kubernetes.io/projected/ac012962-724d-4075-b561-1b6a53a6d9f5-kube-api-access-j6sfg\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-sg-core-conf-yaml\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-combined-ca-bundle\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-config-data\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519637 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-log-httpd\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519724 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-scripts\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.519755 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-run-httpd\") pod \"ac012962-724d-4075-b561-1b6a53a6d9f5\" (UID: \"ac012962-724d-4075-b561-1b6a53a6d9f5\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.520205 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.520221 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.520230 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvkt4\" (UniqueName: \"kubernetes.io/projected/040441ff-4e5b-4e97-aefa-01ebe3fe0720-kube-api-access-gvkt4\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.520240 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.520250 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.523788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/ac012962-724d-4075-b561-1b6a53a6d9f5-kube-api-access-j6sfg" (OuterVolumeSpecName: "kube-api-access-j6sfg") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "kube-api-access-j6sfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.530032 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-scripts" (OuterVolumeSpecName: "scripts") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.530359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.530639 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.596350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "040441ff-4e5b-4e97-aefa-01ebe3fe0720" (UID: "040441ff-4e5b-4e97-aefa-01ebe3fe0720"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.636027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.663832 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/040441ff-4e5b-4e97-aefa-01ebe3fe0720-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.663867 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.663878 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.663887 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac012962-724d-4075-b561-1b6a53a6d9f5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.663896 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6sfg\" (UniqueName: \"kubernetes.io/projected/ac012962-724d-4075-b561-1b6a53a6d9f5-kube-api-access-j6sfg\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.663905 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.748898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.765248 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.829900 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-config-data" (OuterVolumeSpecName: "config-data") pod "ac012962-724d-4075-b561-1b6a53a6d9f5" (UID: "ac012962-724d-4075-b561-1b6a53a6d9f5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.833589 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.833636 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.833835 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-log" containerID="cri-o://9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e" gracePeriod=30 Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.834008 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-metadata" containerID="cri-o://e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419" gracePeriod=30 Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.843780 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.867508 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx4f4\" (UniqueName: \"kubernetes.io/projected/cbd1176f-8fbf-442a-98ed-293aff954480-kube-api-access-mx4f4\") pod \"cbd1176f-8fbf-442a-98ed-293aff954480\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.867551 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-config-data\") pod \"cbd1176f-8fbf-442a-98ed-293aff954480\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.867705 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-scripts\") pod \"cbd1176f-8fbf-442a-98ed-293aff954480\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.867851 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-combined-ca-bundle\") pod \"cbd1176f-8fbf-442a-98ed-293aff954480\" (UID: \"cbd1176f-8fbf-442a-98ed-293aff954480\") " Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.868349 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac012962-724d-4075-b561-1b6a53a6d9f5-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.868823 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-6m772"] Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.873776 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbd1176f-8fbf-442a-98ed-293aff954480-kube-api-access-mx4f4" (OuterVolumeSpecName: "kube-api-access-mx4f4") pod "cbd1176f-8fbf-442a-98ed-293aff954480" (UID: "cbd1176f-8fbf-442a-98ed-293aff954480"). InnerVolumeSpecName "kube-api-access-mx4f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.876528 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56d54d44c7-6m772"] Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.877336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-scripts" (OuterVolumeSpecName: "scripts") pod "cbd1176f-8fbf-442a-98ed-293aff954480" (UID: "cbd1176f-8fbf-442a-98ed-293aff954480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.910623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbd1176f-8fbf-442a-98ed-293aff954480" (UID: "cbd1176f-8fbf-442a-98ed-293aff954480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.939604 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-config-data" (OuterVolumeSpecName: "config-data") pod "cbd1176f-8fbf-442a-98ed-293aff954480" (UID: "cbd1176f-8fbf-442a-98ed-293aff954480"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.969402 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.969437 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.969451 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx4f4\" (UniqueName: \"kubernetes.io/projected/cbd1176f-8fbf-442a-98ed-293aff954480-kube-api-access-mx4f4\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:02 crc kubenswrapper[4764]: I1204 00:03:02.969461 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbd1176f-8fbf-442a-98ed-293aff954480-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.117483 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.323962 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.345946 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerID="e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419" exitCode=0 Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.345990 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerID="9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e" exitCode=143 Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.346046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f32b26c-6ee6-4507-8db5-abd49ed04f8c","Type":"ContainerDied","Data":"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419"} Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.346079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f32b26c-6ee6-4507-8db5-abd49ed04f8c","Type":"ContainerDied","Data":"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e"} Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.346093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f32b26c-6ee6-4507-8db5-abd49ed04f8c","Type":"ContainerDied","Data":"1ed2ea876d387f2cb12bcfd86fb5a6b47f4cc0996332f6846411bc2e567a7e1d"} Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.346049 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.346112 4764 scope.go:117] "RemoveContainer" containerID="e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.351349 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h7p67" event={"ID":"cbd1176f-8fbf-442a-98ed-293aff954480","Type":"ContainerDied","Data":"200e292e64e31e4701deafe1a30b530992989d8065816a0dfd5a088bf8e527d1"} Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.351372 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="200e292e64e31e4701deafe1a30b530992989d8065816a0dfd5a088bf8e527d1" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.351476 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h7p67" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.356165 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-log" containerID="cri-o://6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36" gracePeriod=30 Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.356564 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.358872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac012962-724d-4075-b561-1b6a53a6d9f5","Type":"ContainerDied","Data":"e21339d5d17f776663e1043f5ed920063a8f3e72a5e7e9273fcae4ceabf92fa3"} Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.359614 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-api" containerID="cri-o://a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3" gracePeriod=30 Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.375634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-config-data\") pod \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.375731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkfpp\" (UniqueName: \"kubernetes.io/projected/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-kube-api-access-tkfpp\") pod \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.375829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-logs\") pod \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.375878 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-combined-ca-bundle\") pod \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.375901 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-nova-metadata-tls-certs\") pod \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\" (UID: \"3f32b26c-6ee6-4507-8db5-abd49ed04f8c\") " Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.384020 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-logs" (OuterVolumeSpecName: "logs") pod "3f32b26c-6ee6-4507-8db5-abd49ed04f8c" (UID: "3f32b26c-6ee6-4507-8db5-abd49ed04f8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.403594 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-kube-api-access-tkfpp" (OuterVolumeSpecName: "kube-api-access-tkfpp") pod "3f32b26c-6ee6-4507-8db5-abd49ed04f8c" (UID: "3f32b26c-6ee6-4507-8db5-abd49ed04f8c"). InnerVolumeSpecName "kube-api-access-tkfpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.406325 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407193 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-metadata" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407212 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-metadata" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407233 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerName="init" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407241 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerName="init" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407257 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-central-agent" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407263 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-central-agent" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407276 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerName="dnsmasq-dns" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407282 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerName="dnsmasq-dns" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407292 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" 
containerName="nova-metadata-log" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407298 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-log" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407311 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="proxy-httpd" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407317 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="proxy-httpd" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407325 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbd1176f-8fbf-442a-98ed-293aff954480" containerName="nova-cell1-conductor-db-sync" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407331 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbd1176f-8fbf-442a-98ed-293aff954480" containerName="nova-cell1-conductor-db-sync" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407344 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-notification-agent" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407351 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-notification-agent" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407362 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d131b557-f02e-4925-a9eb-52202bce1b00" containerName="nova-manage" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407368 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d131b557-f02e-4925-a9eb-52202bce1b00" containerName="nova-manage" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.407377 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="sg-core" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407384 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="sg-core" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407612 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-central-agent" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407628 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" containerName="dnsmasq-dns" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407635 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="sg-core" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407641 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-log" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407648 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" containerName="nova-metadata-metadata" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407662 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbd1176f-8fbf-442a-98ed-293aff954480" containerName="nova-cell1-conductor-db-sync" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407671 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d131b557-f02e-4925-a9eb-52202bce1b00" containerName="nova-manage" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407682 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="ceilometer-notification-agent" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.407692 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" containerName="proxy-httpd" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.409195 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.412294 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.415877 4764 scope.go:117] "RemoveContainer" containerID="9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.425126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-config-data" (OuterVolumeSpecName: "config-data") pod "3f32b26c-6ee6-4507-8db5-abd49ed04f8c" (UID: "3f32b26c-6ee6-4507-8db5-abd49ed04f8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.434757 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.448919 4764 scope.go:117] "RemoveContainer" containerID="e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.454926 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419\": container with ID starting with e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419 not found: ID does not exist" containerID="e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.454970 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419"} err="failed to get container status \"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419\": rpc error: code = NotFound desc = could not find container \"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419\": container with ID starting with e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419 not found: ID does not exist" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.455009 4764 scope.go:117] "RemoveContainer" containerID="9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e" Dec 04 00:03:03 crc kubenswrapper[4764]: E1204 00:03:03.458686 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e\": container with ID starting with 9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e not found: ID 
does not exist" containerID="9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.458766 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e"} err="failed to get container status \"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e\": rpc error: code = NotFound desc = could not find container \"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e\": container with ID starting with 9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e not found: ID does not exist" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.458787 4764 scope.go:117] "RemoveContainer" containerID="e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.460364 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419"} err="failed to get container status \"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419\": rpc error: code = NotFound desc = could not find container \"e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419\": container with ID starting with e71f1f4fedec6f4b53f65a8fea9d96f092d94b4db94e2ac9cb36f41a06ab2419 not found: ID does not exist" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.460384 4764 scope.go:117] "RemoveContainer" containerID="9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.461342 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e"} err="failed to get container status \"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e\": rpc error: code = 
NotFound desc = could not find container \"9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e\": container with ID starting with 9f265825f9c966c186b3bbe76b573eb78245c12d9e188543267fdcf914c0353e not found: ID does not exist" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.461385 4764 scope.go:117] "RemoveContainer" containerID="81cdb76ca3b2d742d82a8ad9d4b0512affb4a0653f0db484dfa7546997e003ab" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.465895 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f32b26c-6ee6-4507-8db5-abd49ed04f8c" (UID: "3f32b26c-6ee6-4507-8db5-abd49ed04f8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.471291 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479628 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-992lt\" (UniqueName: \"kubernetes.io/projected/155f4570-7769-42ab-8bc0-168dba070531-kube-api-access-992lt\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479747 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479801 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479813 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkfpp\" (UniqueName: \"kubernetes.io/projected/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-kube-api-access-tkfpp\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479823 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.479831 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.484797 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.498924 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.500116 4764 scope.go:117] "RemoveContainer" containerID="9462a871d3db42e08d77cec747102f2b4ac4c07d2a1b2cfd83ecbf72c52a60a0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.505444 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.508353 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.514259 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.516843 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3f32b26c-6ee6-4507-8db5-abd49ed04f8c" (UID: "3f32b26c-6ee6-4507-8db5-abd49ed04f8c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.516962 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.516991 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.565091 4764 scope.go:117] "RemoveContainer" containerID="95e17231a256246bc718647605db323899aea854db67811fb3cec8329bbbd766" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.581654 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-config-data\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.581797 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.581864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7pbl\" (UniqueName: \"kubernetes.io/projected/d6359cab-59b1-4d0f-828a-abf479b0efbb-kube-api-access-b7pbl\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.581886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.581996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-run-httpd\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.582037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.582060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-scripts\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc 
kubenswrapper[4764]: I1204 00:03:03.582103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-992lt\" (UniqueName: \"kubernetes.io/projected/155f4570-7769-42ab-8bc0-168dba070531-kube-api-access-992lt\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.582176 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.582254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-log-httpd\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.582313 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.582387 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f32b26c-6ee6-4507-8db5-abd49ed04f8c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.586088 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.588426 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.594233 4764 scope.go:117] "RemoveContainer" containerID="f33db2316aaf8ebec702e3b5eba9bd15ed4374529daed1a43ad878fcd96daee0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.600986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-992lt\" (UniqueName: \"kubernetes.io/projected/155f4570-7769-42ab-8bc0-168dba070531-kube-api-access-992lt\") pod \"nova-cell1-conductor-0\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.683884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7pbl\" (UniqueName: \"kubernetes.io/projected/d6359cab-59b1-4d0f-828a-abf479b0efbb-kube-api-access-b7pbl\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.683927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.683971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-run-httpd\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.684001 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-scripts\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.684036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.684071 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-log-httpd\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.684112 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-config-data\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.684143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc 
kubenswrapper[4764]: I1204 00:03:03.686117 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-run-httpd\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.688066 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-log-httpd\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.689938 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.691340 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.691600 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-config-data\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.692394 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-scripts\") pod \"ceilometer-0\" (UID: 
\"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.692503 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.694602 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.702311 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.716349 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7pbl\" (UniqueName: \"kubernetes.io/projected/d6359cab-59b1-4d0f-828a-abf479b0efbb-kube-api-access-b7pbl\") pod \"ceilometer-0\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.722927 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.724566 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.728534 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.728781 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.732128 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.747068 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.830456 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.888148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-logs\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.888213 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.888275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcnbw\" (UniqueName: \"kubernetes.io/projected/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-kube-api-access-qcnbw\") pod \"nova-metadata-0\" (UID: 
\"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.888317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-config-data\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.888711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.992187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcnbw\" (UniqueName: \"kubernetes.io/projected/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-kube-api-access-qcnbw\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.992246 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-config-data\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.992408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: 
I1204 00:03:03.992436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-logs\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.992464 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.993344 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-logs\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.998450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-config-data\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:03 crc kubenswrapper[4764]: I1204 00:03:03.998882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.002318 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.015315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcnbw\" (UniqueName: \"kubernetes.io/projected/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-kube-api-access-qcnbw\") pod \"nova-metadata-0\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " pod="openstack/nova-metadata-0" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.167588 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.262987 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.374703 4764 generic.go:334] "Generic (PLEG): container finished" podID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerID="6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36" exitCode=143 Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.374786 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8195bb2e-e6a6-467c-90f3-a89f1dffe077","Type":"ContainerDied","Data":"6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36"} Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.377994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"155f4570-7769-42ab-8bc0-168dba070531","Type":"ContainerStarted","Data":"0fc61c784df839e37e29bf8736b71d7cfdfc5c4e29c5d9138a7389d570e58bb4"} Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.379573 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="829e611e-4d52-4a09-b7a4-56d64bf2b892" containerName="nova-scheduler-scheduler" containerID="cri-o://dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb" 
gracePeriod=30 Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.393767 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.608364 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="040441ff-4e5b-4e97-aefa-01ebe3fe0720" path="/var/lib/kubelet/pods/040441ff-4e5b-4e97-aefa-01ebe3fe0720/volumes" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.610358 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f32b26c-6ee6-4507-8db5-abd49ed04f8c" path="/var/lib/kubelet/pods/3f32b26c-6ee6-4507-8db5-abd49ed04f8c/volumes" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.611753 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac012962-724d-4075-b561-1b6a53a6d9f5" path="/var/lib/kubelet/pods/ac012962-724d-4075-b561-1b6a53a6d9f5/volumes" Dec 04 00:03:04 crc kubenswrapper[4764]: I1204 00:03:04.705396 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:04 crc kubenswrapper[4764]: W1204 00:03:04.708770 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod982c595a_a3c1_4f3b_aca5_6e6e6cd52e94.slice/crio-978060570c2e58cbad751dd4228a841bc1a84b6e0ec01b855e7576e4d353d667 WatchSource:0}: Error finding container 978060570c2e58cbad751dd4228a841bc1a84b6e0ec01b855e7576e4d353d667: Status 404 returned error can't find the container with id 978060570c2e58cbad751dd4228a841bc1a84b6e0ec01b855e7576e4d353d667 Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.390704 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94","Type":"ContainerStarted","Data":"b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2"} Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.391071 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94","Type":"ContainerStarted","Data":"16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43"} Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.391083 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94","Type":"ContainerStarted","Data":"978060570c2e58cbad751dd4228a841bc1a84b6e0ec01b855e7576e4d353d667"} Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.392509 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"155f4570-7769-42ab-8bc0-168dba070531","Type":"ContainerStarted","Data":"bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f"} Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.392628 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.394033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerStarted","Data":"b480f7cf733d3fdb92024fcd70f4360b1f6d06c5af7e5307d7add5e3b3ce7737"} Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.394058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerStarted","Data":"5213b20f43856c2dfe75fbb94d86e466d407c242991bd132892a352b46db5dcb"} Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.429057 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.429040331 podStartE2EDuration="2.429040331s" podCreationTimestamp="2025-12-04 00:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 00:03:05.412336201 +0000 UTC m=+1321.173660632" watchObservedRunningTime="2025-12-04 00:03:05.429040331 +0000 UTC m=+1321.190364742" Dec 04 00:03:05 crc kubenswrapper[4764]: I1204 00:03:05.432878 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.432867124 podStartE2EDuration="2.432867124s" podCreationTimestamp="2025-12-04 00:03:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:05.425549425 +0000 UTC m=+1321.186873836" watchObservedRunningTime="2025-12-04 00:03:05.432867124 +0000 UTC m=+1321.194191535" Dec 04 00:03:06 crc kubenswrapper[4764]: I1204 00:03:06.409811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerStarted","Data":"8d01af41c07ead7dd15c27b9dfc408606d86ce0d2c7f2669b70e45b21075d02f"} Dec 04 00:03:06 crc kubenswrapper[4764]: E1204 00:03:06.625814 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 00:03:06 crc kubenswrapper[4764]: E1204 00:03:06.627540 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 00:03:06 crc kubenswrapper[4764]: E1204 00:03:06.629368 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot 
register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 00:03:06 crc kubenswrapper[4764]: E1204 00:03:06.629412 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="829e611e-4d52-4a09-b7a4-56d64bf2b892" containerName="nova-scheduler-scheduler" Dec 04 00:03:07 crc kubenswrapper[4764]: I1204 00:03:07.426840 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerStarted","Data":"a09d575229019e7f1d04320e61e6e9f99e868736a8f3abad3709cca28a408a4c"} Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.434182 4764 generic.go:334] "Generic (PLEG): container finished" podID="829e611e-4d52-4a09-b7a4-56d64bf2b892" containerID="dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb" exitCode=0 Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.434338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"829e611e-4d52-4a09-b7a4-56d64bf2b892","Type":"ContainerDied","Data":"dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb"} Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.434762 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"829e611e-4d52-4a09-b7a4-56d64bf2b892","Type":"ContainerDied","Data":"71a0a2beb5d4572d8df3e17fff62989d8d477987046d72c0641788a2ad32b69f"} Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.434781 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a0a2beb5d4572d8df3e17fff62989d8d477987046d72c0641788a2ad32b69f" Dec 04 00:03:08 crc 
kubenswrapper[4764]: I1204 00:03:08.436841 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerStarted","Data":"e29d2ddf9198dd8d757a06f6f7796abcd1931121be32d0495aefee47548f9d85"} Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.437036 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.458873 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.479304 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.192264865 podStartE2EDuration="5.479283082s" podCreationTimestamp="2025-12-04 00:03:03 +0000 UTC" firstStartedPulling="2025-12-04 00:03:04.413847243 +0000 UTC m=+1320.175171654" lastFinishedPulling="2025-12-04 00:03:07.70086546 +0000 UTC m=+1323.462189871" observedRunningTime="2025-12-04 00:03:08.465930684 +0000 UTC m=+1324.227255105" watchObservedRunningTime="2025-12-04 00:03:08.479283082 +0000 UTC m=+1324.240607513" Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.582898 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-config-data\") pod \"829e611e-4d52-4a09-b7a4-56d64bf2b892\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.583195 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf6z5\" (UniqueName: \"kubernetes.io/projected/829e611e-4d52-4a09-b7a4-56d64bf2b892-kube-api-access-vf6z5\") pod \"829e611e-4d52-4a09-b7a4-56d64bf2b892\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 
00:03:08.583270 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle\") pod \"829e611e-4d52-4a09-b7a4-56d64bf2b892\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.589339 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829e611e-4d52-4a09-b7a4-56d64bf2b892-kube-api-access-vf6z5" (OuterVolumeSpecName: "kube-api-access-vf6z5") pod "829e611e-4d52-4a09-b7a4-56d64bf2b892" (UID: "829e611e-4d52-4a09-b7a4-56d64bf2b892"). InnerVolumeSpecName "kube-api-access-vf6z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:08 crc kubenswrapper[4764]: E1204 00:03:08.610504 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle podName:829e611e-4d52-4a09-b7a4-56d64bf2b892 nodeName:}" failed. No retries permitted until 2025-12-04 00:03:09.11047973 +0000 UTC m=+1324.871804141 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle") pod "829e611e-4d52-4a09-b7a4-56d64bf2b892" (UID: "829e611e-4d52-4a09-b7a4-56d64bf2b892") : error deleting /var/lib/kubelet/pods/829e611e-4d52-4a09-b7a4-56d64bf2b892/volume-subpaths: remove /var/lib/kubelet/pods/829e611e-4d52-4a09-b7a4-56d64bf2b892/volume-subpaths: no such file or directory Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.613970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-config-data" (OuterVolumeSpecName: "config-data") pod "829e611e-4d52-4a09-b7a4-56d64bf2b892" (UID: "829e611e-4d52-4a09-b7a4-56d64bf2b892"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.685235 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf6z5\" (UniqueName: \"kubernetes.io/projected/829e611e-4d52-4a09-b7a4-56d64bf2b892-kube-api-access-vf6z5\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:08 crc kubenswrapper[4764]: I1204 00:03:08.685274 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.168196 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.168501 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.192268 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle\") pod \"829e611e-4d52-4a09-b7a4-56d64bf2b892\" (UID: \"829e611e-4d52-4a09-b7a4-56d64bf2b892\") " Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.197301 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "829e611e-4d52-4a09-b7a4-56d64bf2b892" (UID: "829e611e-4d52-4a09-b7a4-56d64bf2b892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.258047 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.295170 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829e611e-4d52-4a09-b7a4-56d64bf2b892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.396856 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle\") pod \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.396985 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwsn9\" (UniqueName: \"kubernetes.io/projected/8195bb2e-e6a6-467c-90f3-a89f1dffe077-kube-api-access-wwsn9\") pod \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.397061 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-config-data\") pod \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.397115 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8195bb2e-e6a6-467c-90f3-a89f1dffe077-logs\") pod \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.397527 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8195bb2e-e6a6-467c-90f3-a89f1dffe077-logs" (OuterVolumeSpecName: "logs") pod 
"8195bb2e-e6a6-467c-90f3-a89f1dffe077" (UID: "8195bb2e-e6a6-467c-90f3-a89f1dffe077"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.399951 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8195bb2e-e6a6-467c-90f3-a89f1dffe077-kube-api-access-wwsn9" (OuterVolumeSpecName: "kube-api-access-wwsn9") pod "8195bb2e-e6a6-467c-90f3-a89f1dffe077" (UID: "8195bb2e-e6a6-467c-90f3-a89f1dffe077"). InnerVolumeSpecName "kube-api-access-wwsn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:09 crc kubenswrapper[4764]: E1204 00:03:09.417272 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle podName:8195bb2e-e6a6-467c-90f3-a89f1dffe077 nodeName:}" failed. No retries permitted until 2025-12-04 00:03:09.917242706 +0000 UTC m=+1325.678567117 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle") pod "8195bb2e-e6a6-467c-90f3-a89f1dffe077" (UID: "8195bb2e-e6a6-467c-90f3-a89f1dffe077") : error deleting /var/lib/kubelet/pods/8195bb2e-e6a6-467c-90f3-a89f1dffe077/volume-subpaths: remove /var/lib/kubelet/pods/8195bb2e-e6a6-467c-90f3-a89f1dffe077/volume-subpaths: no such file or directory Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.420997 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-config-data" (OuterVolumeSpecName: "config-data") pod "8195bb2e-e6a6-467c-90f3-a89f1dffe077" (UID: "8195bb2e-e6a6-467c-90f3-a89f1dffe077"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.460063 4764 generic.go:334] "Generic (PLEG): container finished" podID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerID="a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3" exitCode=0 Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.460118 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.460152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8195bb2e-e6a6-467c-90f3-a89f1dffe077","Type":"ContainerDied","Data":"a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3"} Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.460188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8195bb2e-e6a6-467c-90f3-a89f1dffe077","Type":"ContainerDied","Data":"8877b3f167d5b3dac35e796966255316f2264c388405a9db45f847845b16f206"} Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.460204 4764 scope.go:117] "RemoveContainer" containerID="a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.460324 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.493610 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.498690 4764 scope.go:117] "RemoveContainer" containerID="6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.499716 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.499744 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8195bb2e-e6a6-467c-90f3-a89f1dffe077-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.499754 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwsn9\" (UniqueName: \"kubernetes.io/projected/8195bb2e-e6a6-467c-90f3-a89f1dffe077-kube-api-access-wwsn9\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.502321 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.523808 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:09 crc kubenswrapper[4764]: E1204 00:03:09.524394 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-api" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.524602 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-api" Dec 04 00:03:09 crc kubenswrapper[4764]: E1204 00:03:09.524711 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="829e611e-4d52-4a09-b7a4-56d64bf2b892" containerName="nova-scheduler-scheduler" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.524805 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="829e611e-4d52-4a09-b7a4-56d64bf2b892" containerName="nova-scheduler-scheduler" Dec 04 00:03:09 crc kubenswrapper[4764]: E1204 00:03:09.524922 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-log" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.524997 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-log" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.525306 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-log" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.525427 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="829e611e-4d52-4a09-b7a4-56d64bf2b892" containerName="nova-scheduler-scheduler" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.525524 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" containerName="nova-api-api" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.526251 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.529322 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.535574 4764 scope.go:117] "RemoveContainer" containerID="a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3" Dec 04 00:03:09 crc kubenswrapper[4764]: E1204 00:03:09.536127 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3\": container with ID starting with a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3 not found: ID does not exist" containerID="a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.536167 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3"} err="failed to get container status \"a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3\": rpc error: code = NotFound desc = could not find container \"a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3\": container with ID starting with a0a45e3fea63d6b0f6db65e14458e88e8e5afd9dbc54740e50b007d6331ec7b3 not found: ID does not exist" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.536193 4764 scope.go:117] "RemoveContainer" containerID="6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36" Dec 04 00:03:09 crc kubenswrapper[4764]: E1204 00:03:09.537126 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36\": container with ID starting with 
6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36 not found: ID does not exist" containerID="6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.537611 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36"} err="failed to get container status \"6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36\": rpc error: code = NotFound desc = could not find container \"6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36\": container with ID starting with 6082e10a42118ae2e140582b20ca1698c17d680b36282be92313a9fa63ca7a36 not found: ID does not exist" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.546517 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.678201 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.702539 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpjb\" (UniqueName: \"kubernetes.io/projected/7dcd93b6-05f9-4027-ab36-b33da560ef64-kube-api-access-whpjb\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.702638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-config-data\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.702700 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.803986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-config-data\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.804081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.804176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpjb\" (UniqueName: \"kubernetes.io/projected/7dcd93b6-05f9-4027-ab36-b33da560ef64-kube-api-access-whpjb\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.807594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-config-data\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.810003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.821340 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpjb\" (UniqueName: \"kubernetes.io/projected/7dcd93b6-05f9-4027-ab36-b33da560ef64-kube-api-access-whpjb\") pod \"nova-scheduler-0\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:09 crc kubenswrapper[4764]: I1204 00:03:09.851902 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.007424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle\") pod \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\" (UID: \"8195bb2e-e6a6-467c-90f3-a89f1dffe077\") " Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.028985 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8195bb2e-e6a6-467c-90f3-a89f1dffe077" (UID: "8195bb2e-e6a6-467c-90f3-a89f1dffe077"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.098768 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.110062 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8195bb2e-e6a6-467c-90f3-a89f1dffe077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.138042 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.158821 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.160953 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.163271 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.166028 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.279001 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:10 crc kubenswrapper[4764]: W1204 00:03:10.280808 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dcd93b6_05f9_4027_ab36_b33da560ef64.slice/crio-9b7b63bcea5389e4c9fc4a702e9dac1631f26cea7fb0ea8d1c75aa12ddcf7fa9 WatchSource:0}: Error finding container 9b7b63bcea5389e4c9fc4a702e9dac1631f26cea7fb0ea8d1c75aa12ddcf7fa9: Status 404 returned error can't find the container with id 9b7b63bcea5389e4c9fc4a702e9dac1631f26cea7fb0ea8d1c75aa12ddcf7fa9 Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.314258 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-logs\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.314397 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzft2\" (UniqueName: \"kubernetes.io/projected/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-kube-api-access-fzft2\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.314487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-config-data\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.314559 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.416150 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzft2\" (UniqueName: \"kubernetes.io/projected/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-kube-api-access-fzft2\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.416435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-config-data\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.416619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.416754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-logs\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.417314 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-logs\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.420831 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.421121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-config-data\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.436832 4764 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-fzft2\" (UniqueName: \"kubernetes.io/projected/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-kube-api-access-fzft2\") pod \"nova-api-0\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.476659 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dcd93b6-05f9-4027-ab36-b33da560ef64","Type":"ContainerStarted","Data":"f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31"} Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.476710 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dcd93b6-05f9-4027-ab36-b33da560ef64","Type":"ContainerStarted","Data":"9b7b63bcea5389e4c9fc4a702e9dac1631f26cea7fb0ea8d1c75aa12ddcf7fa9"} Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.485766 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.496911 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.496895966 podStartE2EDuration="1.496895966s" podCreationTimestamp="2025-12-04 00:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:10.491872622 +0000 UTC m=+1326.253197033" watchObservedRunningTime="2025-12-04 00:03:10.496895966 +0000 UTC m=+1326.258220377" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.555626 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8195bb2e-e6a6-467c-90f3-a89f1dffe077" path="/var/lib/kubelet/pods/8195bb2e-e6a6-467c-90f3-a89f1dffe077/volumes" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.556372 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="829e611e-4d52-4a09-b7a4-56d64bf2b892" 
path="/var/lib/kubelet/pods/829e611e-4d52-4a09-b7a4-56d64bf2b892/volumes" Dec 04 00:03:10 crc kubenswrapper[4764]: I1204 00:03:10.914956 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:10 crc kubenswrapper[4764]: W1204 00:03:10.916136 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5388114f_f7a5_4fd8_9cc4_71cf7a5eb53a.slice/crio-bb4e2830fd5972fd782edde3400423422cb991227dce815a950bc0379251b226 WatchSource:0}: Error finding container bb4e2830fd5972fd782edde3400423422cb991227dce815a950bc0379251b226: Status 404 returned error can't find the container with id bb4e2830fd5972fd782edde3400423422cb991227dce815a950bc0379251b226 Dec 04 00:03:11 crc kubenswrapper[4764]: I1204 00:03:11.485050 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a","Type":"ContainerStarted","Data":"5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f"} Dec 04 00:03:11 crc kubenswrapper[4764]: I1204 00:03:11.485377 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a","Type":"ContainerStarted","Data":"c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a"} Dec 04 00:03:11 crc kubenswrapper[4764]: I1204 00:03:11.485388 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a","Type":"ContainerStarted","Data":"bb4e2830fd5972fd782edde3400423422cb991227dce815a950bc0379251b226"} Dec 04 00:03:11 crc kubenswrapper[4764]: I1204 00:03:11.504535 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.504520369 podStartE2EDuration="1.504520369s" podCreationTimestamp="2025-12-04 00:03:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:11.502868829 +0000 UTC m=+1327.264193240" watchObservedRunningTime="2025-12-04 00:03:11.504520369 +0000 UTC m=+1327.265844780" Dec 04 00:03:13 crc kubenswrapper[4764]: I1204 00:03:13.783627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 00:03:14 crc kubenswrapper[4764]: I1204 00:03:14.168760 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 00:03:14 crc kubenswrapper[4764]: I1204 00:03:14.169133 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 00:03:14 crc kubenswrapper[4764]: I1204 00:03:14.852346 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 00:03:15 crc kubenswrapper[4764]: I1204 00:03:15.180924 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 00:03:15 crc kubenswrapper[4764]: I1204 00:03:15.180945 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 00:03:19 crc kubenswrapper[4764]: I1204 00:03:19.852638 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 00:03:19 crc kubenswrapper[4764]: I1204 00:03:19.901346 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 
00:03:20 crc kubenswrapper[4764]: I1204 00:03:20.486644 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 00:03:20 crc kubenswrapper[4764]: I1204 00:03:20.487004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 00:03:20 crc kubenswrapper[4764]: I1204 00:03:20.610653 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 00:03:21 crc kubenswrapper[4764]: I1204 00:03:21.568920 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:03:21 crc kubenswrapper[4764]: I1204 00:03:21.568980 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:03:24 crc kubenswrapper[4764]: I1204 00:03:24.175304 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 00:03:24 crc kubenswrapper[4764]: I1204 00:03:24.177008 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 00:03:24 crc kubenswrapper[4764]: I1204 00:03:24.180747 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 00:03:24 crc kubenswrapper[4764]: I1204 00:03:24.181315 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.630365 4764 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.652508 4764 generic.go:334] "Generic (PLEG): container finished" podID="10778b9d-7c88-4b04-adbc-ad78521192dd" containerID="ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a" exitCode=137 Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.652565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"10778b9d-7c88-4b04-adbc-ad78521192dd","Type":"ContainerDied","Data":"ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a"} Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.652599 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"10778b9d-7c88-4b04-adbc-ad78521192dd","Type":"ContainerDied","Data":"b6dc7ca814500f89896454c22cecd077989d5fbbccdf024f3d05e80945efdbc9"} Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.652648 4764 scope.go:117] "RemoveContainer" containerID="ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.652672 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.679901 4764 scope.go:117] "RemoveContainer" containerID="ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a" Dec 04 00:03:27 crc kubenswrapper[4764]: E1204 00:03:27.680649 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a\": container with ID starting with ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a not found: ID does not exist" containerID="ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.680690 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a"} err="failed to get container status \"ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a\": rpc error: code = NotFound desc = could not find container \"ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a\": container with ID starting with ebb5765d5db87bfc5a07756f028bf0c447d167a1d014d390af0fa4dc431c1e2a not found: ID does not exist" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.824474 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-combined-ca-bundle\") pod \"10778b9d-7c88-4b04-adbc-ad78521192dd\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.824574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljskp\" (UniqueName: \"kubernetes.io/projected/10778b9d-7c88-4b04-adbc-ad78521192dd-kube-api-access-ljskp\") pod \"10778b9d-7c88-4b04-adbc-ad78521192dd\" (UID: 
\"10778b9d-7c88-4b04-adbc-ad78521192dd\") " Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.824686 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-config-data\") pod \"10778b9d-7c88-4b04-adbc-ad78521192dd\" (UID: \"10778b9d-7c88-4b04-adbc-ad78521192dd\") " Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.830235 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10778b9d-7c88-4b04-adbc-ad78521192dd-kube-api-access-ljskp" (OuterVolumeSpecName: "kube-api-access-ljskp") pod "10778b9d-7c88-4b04-adbc-ad78521192dd" (UID: "10778b9d-7c88-4b04-adbc-ad78521192dd"). InnerVolumeSpecName "kube-api-access-ljskp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.850931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-config-data" (OuterVolumeSpecName: "config-data") pod "10778b9d-7c88-4b04-adbc-ad78521192dd" (UID: "10778b9d-7c88-4b04-adbc-ad78521192dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.851461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10778b9d-7c88-4b04-adbc-ad78521192dd" (UID: "10778b9d-7c88-4b04-adbc-ad78521192dd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.926701 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.926751 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljskp\" (UniqueName: \"kubernetes.io/projected/10778b9d-7c88-4b04-adbc-ad78521192dd-kube-api-access-ljskp\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.926768 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10778b9d-7c88-4b04-adbc-ad78521192dd-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.987365 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:03:27 crc kubenswrapper[4764]: I1204 00:03:27.995012 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.019130 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:03:28 crc kubenswrapper[4764]: E1204 00:03:28.019760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10778b9d-7c88-4b04-adbc-ad78521192dd" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.019861 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="10778b9d-7c88-4b04-adbc-ad78521192dd" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.020112 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="10778b9d-7c88-4b04-adbc-ad78521192dd" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 
00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.020821 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.022971 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.023288 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.023631 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.036443 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.130682 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.130746 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sck9\" (UniqueName: \"kubernetes.io/projected/74af5cde-29d3-4ff7-803b-fb335fc8209c-kube-api-access-9sck9\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.130824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.130913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.130951 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.235517 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.235557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sck9\" (UniqueName: \"kubernetes.io/projected/74af5cde-29d3-4ff7-803b-fb335fc8209c-kube-api-access-9sck9\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.235627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.235703 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.235754 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.240868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.241053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.241306 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc 
kubenswrapper[4764]: I1204 00:03:28.241815 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.259124 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sck9\" (UniqueName: \"kubernetes.io/projected/74af5cde-29d3-4ff7-803b-fb335fc8209c-kube-api-access-9sck9\") pod \"nova-cell1-novncproxy-0\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.345227 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.626180 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10778b9d-7c88-4b04-adbc-ad78521192dd" path="/var/lib/kubelet/pods/10778b9d-7c88-4b04-adbc-ad78521192dd/volumes" Dec 04 00:03:28 crc kubenswrapper[4764]: I1204 00:03:28.774238 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:03:28 crc kubenswrapper[4764]: W1204 00:03:28.791983 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74af5cde_29d3_4ff7_803b_fb335fc8209c.slice/crio-fdccedd548d43aa1a4310ba8a3024c12953b9bef6344491d42091024a15ca257 WatchSource:0}: Error finding container fdccedd548d43aa1a4310ba8a3024c12953b9bef6344491d42091024a15ca257: Status 404 returned error can't find the container with id fdccedd548d43aa1a4310ba8a3024c12953b9bef6344491d42091024a15ca257 Dec 04 00:03:29 crc kubenswrapper[4764]: I1204 00:03:29.724530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74af5cde-29d3-4ff7-803b-fb335fc8209c","Type":"ContainerStarted","Data":"49b1d45450edbc5e515da4d8f049c2433b6679ef6419dc9e04e61f99fccf319b"} Dec 04 00:03:29 crc kubenswrapper[4764]: I1204 00:03:29.725114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74af5cde-29d3-4ff7-803b-fb335fc8209c","Type":"ContainerStarted","Data":"fdccedd548d43aa1a4310ba8a3024c12953b9bef6344491d42091024a15ca257"} Dec 04 00:03:29 crc kubenswrapper[4764]: I1204 00:03:29.747356 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.747342755 podStartE2EDuration="2.747342755s" podCreationTimestamp="2025-12-04 00:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:29.74594299 +0000 UTC m=+1345.507267431" watchObservedRunningTime="2025-12-04 00:03:29.747342755 +0000 UTC m=+1345.508667166" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.489280 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.489747 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.490973 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.492301 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.733297 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.739166 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.944433 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-znhdg"] Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.949573 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.961443 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-znhdg"] Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.995384 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2qtl\" (UniqueName: \"kubernetes.io/projected/e6152d07-38d3-42e7-953f-d9747b1f8996-kube-api-access-n2qtl\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.995436 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.995507 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-svc\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.995553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-config\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.995647 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:30 crc kubenswrapper[4764]: I1204 00:03:30.995708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.097164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.097526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.097566 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2qtl\" 
(UniqueName: \"kubernetes.io/projected/e6152d07-38d3-42e7-953f-d9747b1f8996-kube-api-access-n2qtl\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.097608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.097679 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-svc\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.097761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-config\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.098226 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-swift-storage-0\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.098608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-nb\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.098652 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-config\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.098978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-svc\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.099333 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-sb\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.124775 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2qtl\" (UniqueName: \"kubernetes.io/projected/e6152d07-38d3-42e7-953f-d9747b1f8996-kube-api-access-n2qtl\") pod \"dnsmasq-dns-7f9fbbf6f7-znhdg\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.276038 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:31 crc kubenswrapper[4764]: W1204 00:03:31.752388 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6152d07_38d3_42e7_953f_d9747b1f8996.slice/crio-0f0dd0efe85778b87bfd31a1306fdfebd9558b21164f1bb2bd1dc32836c34c69 WatchSource:0}: Error finding container 0f0dd0efe85778b87bfd31a1306fdfebd9558b21164f1bb2bd1dc32836c34c69: Status 404 returned error can't find the container with id 0f0dd0efe85778b87bfd31a1306fdfebd9558b21164f1bb2bd1dc32836c34c69 Dec 04 00:03:31 crc kubenswrapper[4764]: I1204 00:03:31.754695 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-znhdg"] Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.752570 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerID="1a9344597699f53d84c39d5a1632513c0b73908012de706938aa4ae664672a32" exitCode=0 Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.752662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" event={"ID":"e6152d07-38d3-42e7-953f-d9747b1f8996","Type":"ContainerDied","Data":"1a9344597699f53d84c39d5a1632513c0b73908012de706938aa4ae664672a32"} Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.752917 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" event={"ID":"e6152d07-38d3-42e7-953f-d9747b1f8996","Type":"ContainerStarted","Data":"0f0dd0efe85778b87bfd31a1306fdfebd9558b21164f1bb2bd1dc32836c34c69"} Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.777153 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.777433 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-central-agent" containerID="cri-o://b480f7cf733d3fdb92024fcd70f4360b1f6d06c5af7e5307d7add5e3b3ce7737" gracePeriod=30 Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.777920 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="proxy-httpd" containerID="cri-o://e29d2ddf9198dd8d757a06f6f7796abcd1931121be32d0495aefee47548f9d85" gracePeriod=30 Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.777989 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="sg-core" containerID="cri-o://a09d575229019e7f1d04320e61e6e9f99e868736a8f3abad3709cca28a408a4c" gracePeriod=30 Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.777992 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-notification-agent" containerID="cri-o://8d01af41c07ead7dd15c27b9dfc408606d86ce0d2c7f2669b70e45b21075d02f" gracePeriod=30 Dec 04 00:03:32 crc kubenswrapper[4764]: I1204 00:03:32.799567 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.345973 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.770635 4764 generic.go:334] "Generic (PLEG): container finished" podID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerID="e29d2ddf9198dd8d757a06f6f7796abcd1931121be32d0495aefee47548f9d85" exitCode=0 Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 
00:03:33.771098 4764 generic.go:334] "Generic (PLEG): container finished" podID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerID="a09d575229019e7f1d04320e61e6e9f99e868736a8f3abad3709cca28a408a4c" exitCode=2 Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.771117 4764 generic.go:334] "Generic (PLEG): container finished" podID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerID="b480f7cf733d3fdb92024fcd70f4360b1f6d06c5af7e5307d7add5e3b3ce7737" exitCode=0 Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.770844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerDied","Data":"e29d2ddf9198dd8d757a06f6f7796abcd1931121be32d0495aefee47548f9d85"} Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.771192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerDied","Data":"a09d575229019e7f1d04320e61e6e9f99e868736a8f3abad3709cca28a408a4c"} Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.771216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerDied","Data":"b480f7cf733d3fdb92024fcd70f4360b1f6d06c5af7e5307d7add5e3b3ce7737"} Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.779200 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" event={"ID":"e6152d07-38d3-42e7-953f-d9747b1f8996","Type":"ContainerStarted","Data":"8af4acd3740dccbf24a933df87c06cee9c0c919aae4bd8f32b080872dd5b0c80"} Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.779414 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.814634 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" podStartSLOduration=3.814617458 podStartE2EDuration="3.814617458s" podCreationTimestamp="2025-12-04 00:03:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:33.810413405 +0000 UTC m=+1349.571737826" watchObservedRunningTime="2025-12-04 00:03:33.814617458 +0000 UTC m=+1349.575941879" Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.832108 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.190:3000/\": dial tcp 10.217.0.190:3000: connect: connection refused" Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.854775 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.855737 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-log" containerID="cri-o://c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a" gracePeriod=30 Dec 04 00:03:33 crc kubenswrapper[4764]: I1204 00:03:33.855892 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-api" containerID="cri-o://5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f" gracePeriod=30 Dec 04 00:03:34 crc kubenswrapper[4764]: I1204 00:03:34.788416 4764 generic.go:334] "Generic (PLEG): container finished" podID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerID="c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a" exitCode=143 Dec 04 00:03:34 crc kubenswrapper[4764]: I1204 00:03:34.789539 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a","Type":"ContainerDied","Data":"c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a"} Dec 04 00:03:36 crc kubenswrapper[4764]: I1204 00:03:36.815514 4764 generic.go:334] "Generic (PLEG): container finished" podID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerID="8d01af41c07ead7dd15c27b9dfc408606d86ce0d2c7f2669b70e45b21075d02f" exitCode=0 Dec 04 00:03:36 crc kubenswrapper[4764]: I1204 00:03:36.815588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerDied","Data":"8d01af41c07ead7dd15c27b9dfc408606d86ce0d2c7f2669b70e45b21075d02f"} Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.206868 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340177 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-config-data\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340246 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-combined-ca-bundle\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340320 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-ceilometer-tls-certs\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc 
kubenswrapper[4764]: I1204 00:03:37.340356 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7pbl\" (UniqueName: \"kubernetes.io/projected/d6359cab-59b1-4d0f-828a-abf479b0efbb-kube-api-access-b7pbl\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-scripts\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340447 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-run-httpd\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-log-httpd\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.340518 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-sg-core-conf-yaml\") pod \"d6359cab-59b1-4d0f-828a-abf479b0efbb\" (UID: \"d6359cab-59b1-4d0f-828a-abf479b0efbb\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.341909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.342252 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.346569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-scripts" (OuterVolumeSpecName: "scripts") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.347797 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6359cab-59b1-4d0f-828a-abf479b0efbb-kube-api-access-b7pbl" (OuterVolumeSpecName: "kube-api-access-b7pbl") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "kube-api-access-b7pbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.368491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.394140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.410491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443357 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443772 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443805 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7pbl\" (UniqueName: \"kubernetes.io/projected/d6359cab-59b1-4d0f-828a-abf479b0efbb-kube-api-access-b7pbl\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443829 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443847 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443864 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6359cab-59b1-4d0f-828a-abf479b0efbb-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.443881 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.475675 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.483946 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-config-data" (OuterVolumeSpecName: "config-data") pod "d6359cab-59b1-4d0f-828a-abf479b0efbb" (UID: "d6359cab-59b1-4d0f-828a-abf479b0efbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.544957 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-config-data\") pod \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.545008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-combined-ca-bundle\") pod \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.545029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzft2\" (UniqueName: \"kubernetes.io/projected/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-kube-api-access-fzft2\") pod \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.545086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-logs\") pod \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\" (UID: \"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a\") " Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.545847 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6359cab-59b1-4d0f-828a-abf479b0efbb-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.546093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-logs" (OuterVolumeSpecName: "logs") pod "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" 
(UID: "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.549473 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-kube-api-access-fzft2" (OuterVolumeSpecName: "kube-api-access-fzft2") pod "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" (UID: "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a"). InnerVolumeSpecName "kube-api-access-fzft2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.574759 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-config-data" (OuterVolumeSpecName: "config-data") pod "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" (UID: "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.575167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" (UID: "5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.647512 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.647564 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.647576 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzft2\" (UniqueName: \"kubernetes.io/projected/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-kube-api-access-fzft2\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.647602 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.827093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6359cab-59b1-4d0f-828a-abf479b0efbb","Type":"ContainerDied","Data":"5213b20f43856c2dfe75fbb94d86e466d407c242991bd132892a352b46db5dcb"} Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.827154 4764 scope.go:117] "RemoveContainer" containerID="e29d2ddf9198dd8d757a06f6f7796abcd1931121be32d0495aefee47548f9d85" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.827106 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.829899 4764 generic.go:334] "Generic (PLEG): container finished" podID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerID="5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f" exitCode=0 Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.829949 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.829946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a","Type":"ContainerDied","Data":"5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f"} Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.830125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a","Type":"ContainerDied","Data":"bb4e2830fd5972fd782edde3400423422cb991227dce815a950bc0379251b226"} Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.872167 4764 scope.go:117] "RemoveContainer" containerID="a09d575229019e7f1d04320e61e6e9f99e868736a8f3abad3709cca28a408a4c" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.885329 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.897530 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.907164 4764 scope.go:117] "RemoveContainer" containerID="8d01af41c07ead7dd15c27b9dfc408606d86ce0d2c7f2669b70e45b21075d02f" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.916855 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.930567 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.936997 4764 scope.go:117] "RemoveContainer" containerID="b480f7cf733d3fdb92024fcd70f4360b1f6d06c5af7e5307d7add5e3b3ce7737" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950147 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: E1204 00:03:37.950621 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-log" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950643 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-log" Dec 04 00:03:37 crc kubenswrapper[4764]: E1204 00:03:37.950665 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-api" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950672 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-api" Dec 04 00:03:37 crc kubenswrapper[4764]: E1204 00:03:37.950688 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-notification-agent" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950696 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-notification-agent" Dec 04 00:03:37 crc kubenswrapper[4764]: E1204 00:03:37.950706 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-central-agent" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950730 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-central-agent" Dec 04 00:03:37 crc 
kubenswrapper[4764]: E1204 00:03:37.950750 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="proxy-httpd" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950759 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="proxy-httpd" Dec 04 00:03:37 crc kubenswrapper[4764]: E1204 00:03:37.950787 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="sg-core" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950794 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="sg-core" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.950997 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="proxy-httpd" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.951012 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-notification-agent" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.951026 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-api" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.951049 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" containerName="nova-api-log" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.951065 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="sg-core" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.951081 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" containerName="ceilometer-central-agent" Dec 04 00:03:37 crc 
kubenswrapper[4764]: I1204 00:03:37.952993 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.954800 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.968131 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.968274 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.970424 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.983805 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.987008 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.989724 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.993287 4764 scope.go:117] "RemoveContainer" containerID="5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.993373 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.993595 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 00:03:37 crc kubenswrapper[4764]: I1204 00:03:37.994624 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.019102 4764 scope.go:117] "RemoveContainer" containerID="c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.035704 4764 scope.go:117] "RemoveContainer" containerID="5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f" Dec 04 00:03:38 crc kubenswrapper[4764]: E1204 00:03:38.036021 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f\": container with ID starting with 5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f not found: ID does not exist" containerID="5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.036057 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f"} err="failed to get container status 
\"5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f\": rpc error: code = NotFound desc = could not find container \"5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f\": container with ID starting with 5a0cef3d3efefa0e1cbeda45864c0c9154cf73164ae49cb925402f721f60337f not found: ID does not exist" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.036078 4764 scope.go:117] "RemoveContainer" containerID="c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a" Dec 04 00:03:38 crc kubenswrapper[4764]: E1204 00:03:38.036258 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a\": container with ID starting with c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a not found: ID does not exist" containerID="c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.036282 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a"} err="failed to get container status \"c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a\": rpc error: code = NotFound desc = could not find container \"c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a\": container with ID starting with c7bdde8ca053009c0f399ea7b93c35118adb206b21e0ee5a97ef3d0380f84b5a not found: ID does not exist" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053377 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-scripts\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053417 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m57x\" (UniqueName: \"kubernetes.io/projected/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-kube-api-access-6m57x\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-config-data\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-config-data\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053617 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-log-httpd\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-logs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053929 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9r2j\" (UniqueName: \"kubernetes.io/projected/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-kube-api-access-b9r2j\") pod \"nova-api-0\" (UID: 
\"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.053980 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-run-httpd\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.155864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.155902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-log-httpd\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.155946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-logs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.155970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-b9r2j\" (UniqueName: \"kubernetes.io/projected/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-kube-api-access-b9r2j\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.155989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-run-httpd\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-scripts\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m57x\" (UniqueName: \"kubernetes.io/projected/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-kube-api-access-6m57x\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-config-data\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 
crc kubenswrapper[4764]: I1204 00:03:38.156136 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156151 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-config-data\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.156211 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.158506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-log-httpd\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.158773 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-logs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.160001 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-run-httpd\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.160402 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.161181 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.173744 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-config-data\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.173695 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-public-tls-certs\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.173895 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.174061 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-config-data\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.174116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-scripts\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.174204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.176241 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc 
kubenswrapper[4764]: I1204 00:03:38.176604 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m57x\" (UniqueName: \"kubernetes.io/projected/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-kube-api-access-6m57x\") pod \"ceilometer-0\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.177214 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9r2j\" (UniqueName: \"kubernetes.io/projected/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-kube-api-access-b9r2j\") pod \"nova-api-0\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.280150 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.311666 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.346408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.374496 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.563845 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a" path="/var/lib/kubelet/pods/5388114f-f7a5-4fd8-9cc4-71cf7a5eb53a/volumes" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.565902 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6359cab-59b1-4d0f-828a-abf479b0efbb" path="/var/lib/kubelet/pods/d6359cab-59b1-4d0f-828a-abf479b0efbb/volumes" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.866291 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.871873 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:03:38 crc kubenswrapper[4764]: W1204 00:03:38.874079 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1179e07c_0222_4ae3_9bd2_81e2f5a6a6b9.slice/crio-489744868f9cf8cc833c6f6db95f4e28d2b503cec14f35fd87bb774c813b9b2b WatchSource:0}: Error finding container 489744868f9cf8cc833c6f6db95f4e28d2b503cec14f35fd87bb774c813b9b2b: Status 404 returned error can't find the container with id 489744868f9cf8cc833c6f6db95f4e28d2b503cec14f35fd87bb774c813b9b2b Dec 04 00:03:38 crc kubenswrapper[4764]: I1204 00:03:38.886016 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.224927 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4mnsn"] Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.226129 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.234571 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.234754 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.238397 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4mnsn"] Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.406822 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8brrv\" (UniqueName: \"kubernetes.io/projected/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-kube-api-access-8brrv\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.407511 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-scripts\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.407785 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.407934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-config-data\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.509379 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-config-data\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.509449 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8brrv\" (UniqueName: \"kubernetes.io/projected/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-kube-api-access-8brrv\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.509488 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-scripts\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.509558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.513817 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-scripts\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.514147 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-config-data\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.515318 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.532764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8brrv\" (UniqueName: \"kubernetes.io/projected/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-kube-api-access-8brrv\") pod \"nova-cell1-cell-mapping-4mnsn\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.574517 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.856407 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9","Type":"ContainerStarted","Data":"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe"} Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.856677 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9","Type":"ContainerStarted","Data":"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283"} Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.856689 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9","Type":"ContainerStarted","Data":"489744868f9cf8cc833c6f6db95f4e28d2b503cec14f35fd87bb774c813b9b2b"} Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.870505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerStarted","Data":"581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93"} Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.870690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerStarted","Data":"77d435b07ebd0f55f4c323c1297c8886838f47b04c073112c54fd6cb080e76cb"} Dec 04 00:03:39 crc kubenswrapper[4764]: I1204 00:03:39.881281 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.881256799 podStartE2EDuration="2.881256799s" podCreationTimestamp="2025-12-04 00:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:39.875757354 
+0000 UTC m=+1355.637081765" watchObservedRunningTime="2025-12-04 00:03:39.881256799 +0000 UTC m=+1355.642581220" Dec 04 00:03:40 crc kubenswrapper[4764]: I1204 00:03:40.051254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4mnsn"] Dec 04 00:03:40 crc kubenswrapper[4764]: W1204 00:03:40.060921 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd15434b3_f4f4_4e53_8ddd_db0df89aca8a.slice/crio-f1f1fb0e072a7b260c0f839b665c1bf3d4f16823cae348db203346434fd83b4e WatchSource:0}: Error finding container f1f1fb0e072a7b260c0f839b665c1bf3d4f16823cae348db203346434fd83b4e: Status 404 returned error can't find the container with id f1f1fb0e072a7b260c0f839b665c1bf3d4f16823cae348db203346434fd83b4e Dec 04 00:03:40 crc kubenswrapper[4764]: I1204 00:03:40.879424 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4mnsn" event={"ID":"d15434b3-f4f4-4e53-8ddd-db0df89aca8a","Type":"ContainerStarted","Data":"af3ad162775dbb086e4ee57f257df51f07acc80478f78e561812ab0e3ff974ec"} Dec 04 00:03:40 crc kubenswrapper[4764]: I1204 00:03:40.879743 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4mnsn" event={"ID":"d15434b3-f4f4-4e53-8ddd-db0df89aca8a","Type":"ContainerStarted","Data":"f1f1fb0e072a7b260c0f839b665c1bf3d4f16823cae348db203346434fd83b4e"} Dec 04 00:03:40 crc kubenswrapper[4764]: I1204 00:03:40.883254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerStarted","Data":"52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81"} Dec 04 00:03:40 crc kubenswrapper[4764]: I1204 00:03:40.883303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerStarted","Data":"6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db"} Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.277886 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.327845 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4mnsn" podStartSLOduration=2.327824467 podStartE2EDuration="2.327824467s" podCreationTimestamp="2025-12-04 00:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:40.917181596 +0000 UTC m=+1356.678506007" watchObservedRunningTime="2025-12-04 00:03:41.327824467 +0000 UTC m=+1357.089148888" Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.357558 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-b6krd"] Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.357835 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="dnsmasq-dns" containerID="cri-o://17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca" gracePeriod=10 Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.864933 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.916276 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerID="17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca" exitCode=0 Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.917171 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.917348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" event={"ID":"e3a1bc90-81db-4765-82e7-d74d47aeb02b","Type":"ContainerDied","Data":"17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca"} Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.917459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" event={"ID":"e3a1bc90-81db-4765-82e7-d74d47aeb02b","Type":"ContainerDied","Data":"b88a2ae68a4a4f41d414b08614bc6dd13a4a67649e943512b9b0af66bcf216eb"} Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.917528 4764 scope.go:117] "RemoveContainer" containerID="17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca" Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.956029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-sb\") pod \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.956215 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjmwd\" (UniqueName: \"kubernetes.io/projected/e3a1bc90-81db-4765-82e7-d74d47aeb02b-kube-api-access-cjmwd\") pod \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.956674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-swift-storage-0\") pod \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " Dec 04 00:03:41 crc kubenswrapper[4764]: 
I1204 00:03:41.956799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-config\") pod \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.957107 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-svc\") pod \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.957323 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-nb\") pod \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\" (UID: \"e3a1bc90-81db-4765-82e7-d74d47aeb02b\") " Dec 04 00:03:41 crc kubenswrapper[4764]: I1204 00:03:41.960021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a1bc90-81db-4765-82e7-d74d47aeb02b-kube-api-access-cjmwd" (OuterVolumeSpecName: "kube-api-access-cjmwd") pod "e3a1bc90-81db-4765-82e7-d74d47aeb02b" (UID: "e3a1bc90-81db-4765-82e7-d74d47aeb02b"). InnerVolumeSpecName "kube-api-access-cjmwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.016397 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-config" (OuterVolumeSpecName: "config") pod "e3a1bc90-81db-4765-82e7-d74d47aeb02b" (UID: "e3a1bc90-81db-4765-82e7-d74d47aeb02b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.024214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e3a1bc90-81db-4765-82e7-d74d47aeb02b" (UID: "e3a1bc90-81db-4765-82e7-d74d47aeb02b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.025860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e3a1bc90-81db-4765-82e7-d74d47aeb02b" (UID: "e3a1bc90-81db-4765-82e7-d74d47aeb02b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.038123 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e3a1bc90-81db-4765-82e7-d74d47aeb02b" (UID: "e3a1bc90-81db-4765-82e7-d74d47aeb02b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.039203 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3a1bc90-81db-4765-82e7-d74d47aeb02b" (UID: "e3a1bc90-81db-4765-82e7-d74d47aeb02b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.060679 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.060708 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.060732 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjmwd\" (UniqueName: \"kubernetes.io/projected/e3a1bc90-81db-4765-82e7-d74d47aeb02b-kube-api-access-cjmwd\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.060742 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.060751 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.060761 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3a1bc90-81db-4765-82e7-d74d47aeb02b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.111495 4764 scope.go:117] "RemoveContainer" containerID="eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.136077 4764 scope.go:117] "RemoveContainer" 
containerID="17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca" Dec 04 00:03:42 crc kubenswrapper[4764]: E1204 00:03:42.137021 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca\": container with ID starting with 17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca not found: ID does not exist" containerID="17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.137061 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca"} err="failed to get container status \"17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca\": rpc error: code = NotFound desc = could not find container \"17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca\": container with ID starting with 17b29af5f5deb0503b0085b7e6d44c4baba35a82fbf2f8934d170bbe670b4eca not found: ID does not exist" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.137105 4764 scope.go:117] "RemoveContainer" containerID="eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715" Dec 04 00:03:42 crc kubenswrapper[4764]: E1204 00:03:42.137481 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715\": container with ID starting with eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715 not found: ID does not exist" containerID="eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.137643 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715"} err="failed to get container status \"eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715\": rpc error: code = NotFound desc = could not find container \"eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715\": container with ID starting with eb64124870cbc5958e96e334fc12dc6c2d928f0d3dd25f985d53334345bea715 not found: ID does not exist" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.271075 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-b6krd"] Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.279982 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd87576bf-b6krd"] Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.559127 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" path="/var/lib/kubelet/pods/e3a1bc90-81db-4765-82e7-d74d47aeb02b/volumes" Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.929294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerStarted","Data":"76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7"} Dec 04 00:03:42 crc kubenswrapper[4764]: I1204 00:03:42.929487 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 00:03:45 crc kubenswrapper[4764]: I1204 00:03:45.989805 4764 generic.go:334] "Generic (PLEG): container finished" podID="d15434b3-f4f4-4e53-8ddd-db0df89aca8a" containerID="af3ad162775dbb086e4ee57f257df51f07acc80478f78e561812ab0e3ff974ec" exitCode=0 Dec 04 00:03:45 crc kubenswrapper[4764]: I1204 00:03:45.989890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4mnsn" 
event={"ID":"d15434b3-f4f4-4e53-8ddd-db0df89aca8a","Type":"ContainerDied","Data":"af3ad162775dbb086e4ee57f257df51f07acc80478f78e561812ab0e3ff974ec"} Dec 04 00:03:46 crc kubenswrapper[4764]: I1204 00:03:46.016260 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.014928674 podStartE2EDuration="9.016230514s" podCreationTimestamp="2025-12-04 00:03:37 +0000 UTC" firstStartedPulling="2025-12-04 00:03:38.858457113 +0000 UTC m=+1354.619781524" lastFinishedPulling="2025-12-04 00:03:41.859758953 +0000 UTC m=+1357.621083364" observedRunningTime="2025-12-04 00:03:42.960632764 +0000 UTC m=+1358.721957175" watchObservedRunningTime="2025-12-04 00:03:46.016230514 +0000 UTC m=+1361.777554975" Dec 04 00:03:46 crc kubenswrapper[4764]: I1204 00:03:46.676188 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7bd87576bf-b6krd" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: i/o timeout" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.420178 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hzf8s"] Dec 04 00:03:47 crc kubenswrapper[4764]: E1204 00:03:47.420785 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="init" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.420806 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="init" Dec 04 00:03:47 crc kubenswrapper[4764]: E1204 00:03:47.420851 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="dnsmasq-dns" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.420863 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="dnsmasq-dns" 
Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.421235 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a1bc90-81db-4765-82e7-d74d47aeb02b" containerName="dnsmasq-dns" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.423407 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.429658 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.432393 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hzf8s"] Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.464199 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-combined-ca-bundle\") pod \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.464429 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-config-data\") pod \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.464482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8brrv\" (UniqueName: \"kubernetes.io/projected/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-kube-api-access-8brrv\") pod \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.464551 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-scripts\") pod \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\" (UID: \"d15434b3-f4f4-4e53-8ddd-db0df89aca8a\") " Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.464876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-catalog-content\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.464950 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-utilities\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.465111 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b57n9\" (UniqueName: \"kubernetes.io/projected/717de319-f1ef-4dd5-8e96-3b7087f6a83d-kube-api-access-b57n9\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.475350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-scripts" (OuterVolumeSpecName: "scripts") pod "d15434b3-f4f4-4e53-8ddd-db0df89aca8a" (UID: "d15434b3-f4f4-4e53-8ddd-db0df89aca8a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.489510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-kube-api-access-8brrv" (OuterVolumeSpecName: "kube-api-access-8brrv") pod "d15434b3-f4f4-4e53-8ddd-db0df89aca8a" (UID: "d15434b3-f4f4-4e53-8ddd-db0df89aca8a"). InnerVolumeSpecName "kube-api-access-8brrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.498336 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d15434b3-f4f4-4e53-8ddd-db0df89aca8a" (UID: "d15434b3-f4f4-4e53-8ddd-db0df89aca8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.507127 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-config-data" (OuterVolumeSpecName: "config-data") pod "d15434b3-f4f4-4e53-8ddd-db0df89aca8a" (UID: "d15434b3-f4f4-4e53-8ddd-db0df89aca8a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-utilities\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567338 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b57n9\" (UniqueName: \"kubernetes.io/projected/717de319-f1ef-4dd5-8e96-3b7087f6a83d-kube-api-access-b57n9\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-catalog-content\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567580 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8brrv\" (UniqueName: \"kubernetes.io/projected/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-kube-api-access-8brrv\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567597 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567607 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-combined-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567620 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d15434b3-f4f4-4e53-8ddd-db0df89aca8a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.567669 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-utilities\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.568059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-catalog-content\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.584526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b57n9\" (UniqueName: \"kubernetes.io/projected/717de319-f1ef-4dd5-8e96-3b7087f6a83d-kube-api-access-b57n9\") pod \"redhat-operators-hzf8s\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:47 crc kubenswrapper[4764]: I1204 00:03:47.749702 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.016132 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4mnsn" event={"ID":"d15434b3-f4f4-4e53-8ddd-db0df89aca8a","Type":"ContainerDied","Data":"f1f1fb0e072a7b260c0f839b665c1bf3d4f16823cae348db203346434fd83b4e"} Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.016198 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4mnsn" Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.016221 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1f1fb0e072a7b260c0f839b665c1bf3d4f16823cae348db203346434fd83b4e" Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.184926 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hzf8s"] Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.225746 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.225979 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-log" containerID="cri-o://99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283" gracePeriod=30 Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.226160 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-api" containerID="cri-o://f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe" gracePeriod=30 Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.245881 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 
00:03:48.246164 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7dcd93b6-05f9-4027-ab36-b33da560ef64" containerName="nova-scheduler-scheduler" containerID="cri-o://f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" gracePeriod=30 Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.258272 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.258481 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-log" containerID="cri-o://16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43" gracePeriod=30 Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.258865 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-metadata" containerID="cri-o://b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2" gracePeriod=30 Dec 04 00:03:48 crc kubenswrapper[4764]: I1204 00:03:48.823954 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:48.996274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-combined-ca-bundle\") pod \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:48.996361 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9r2j\" (UniqueName: \"kubernetes.io/projected/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-kube-api-access-b9r2j\") pod \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:48.996417 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-public-tls-certs\") pod \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:48.996458 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-logs\") pod \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:48.996475 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-config-data\") pod \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:48.996495 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-internal-tls-certs\") pod \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\" (UID: \"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9\") " Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.004936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-logs" (OuterVolumeSpecName: "logs") pod "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" (UID: "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.008445 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-kube-api-access-b9r2j" (OuterVolumeSpecName: "kube-api-access-b9r2j") pod "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" (UID: "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9"). InnerVolumeSpecName "kube-api-access-b9r2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.033480 4764 generic.go:334] "Generic (PLEG): container finished" podID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerID="c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10" exitCode=0 Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.034370 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerDied","Data":"c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10"} Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.034409 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerStarted","Data":"eb29401c9fb8b1af97fddf3fd54dc66154bbdb8a2209bdf88a2d83b6a1a177ce"} Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 
00:03:49.037013 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" (UID: "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.042138 4764 generic.go:334] "Generic (PLEG): container finished" podID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerID="16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43" exitCode=143 Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.042198 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94","Type":"ContainerDied","Data":"16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43"} Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045059 4764 generic.go:334] "Generic (PLEG): container finished" podID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerID="f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe" exitCode=0 Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045083 4764 generic.go:334] "Generic (PLEG): container finished" podID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerID="99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283" exitCode=143 Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9","Type":"ContainerDied","Data":"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe"} Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045122 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9","Type":"ContainerDied","Data":"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283"} Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9","Type":"ContainerDied","Data":"489744868f9cf8cc833c6f6db95f4e28d2b503cec14f35fd87bb774c813b9b2b"} Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045151 4764 scope.go:117] "RemoveContainer" containerID="f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.045303 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.048694 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-config-data" (OuterVolumeSpecName: "config-data") pod "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" (UID: "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.074916 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" (UID: "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.079875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" (UID: "1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.079974 4764 scope.go:117] "RemoveContainer" containerID="99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.100106 4764 scope.go:117] "RemoveContainer" containerID="f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.101287 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.101313 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9r2j\" (UniqueName: \"kubernetes.io/projected/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-kube-api-access-b9r2j\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.101327 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.101338 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.101348 4764 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.101356 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.102115 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe\": container with ID starting with f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe not found: ID does not exist" containerID="f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.102147 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe"} err="failed to get container status \"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe\": rpc error: code = NotFound desc = could not find container \"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe\": container with ID starting with f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe not found: ID does not exist" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.102171 4764 scope.go:117] "RemoveContainer" containerID="99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283" Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.102442 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283\": container with ID starting with 
99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283 not found: ID does not exist" containerID="99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.102458 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283"} err="failed to get container status \"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283\": rpc error: code = NotFound desc = could not find container \"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283\": container with ID starting with 99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283 not found: ID does not exist" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.102470 4764 scope.go:117] "RemoveContainer" containerID="f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.102774 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe"} err="failed to get container status \"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe\": rpc error: code = NotFound desc = could not find container \"f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe\": container with ID starting with f89eefb7c4dd5269ae01e01bd1903f1aaa7e022bd41a26c847db6cdaca9468fe not found: ID does not exist" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.102792 4764 scope.go:117] "RemoveContainer" containerID="99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.103012 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283"} err="failed to get container status 
\"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283\": rpc error: code = NotFound desc = could not find container \"99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283\": container with ID starting with 99efedc534c8f7d6dbef86627e2cb73793aa20af07c5bd341cb1a18351834283 not found: ID does not exist" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.429026 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.437427 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.450665 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.451254 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-log" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.451434 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-log" Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.451549 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-api" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.451732 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-api" Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.451801 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d15434b3-f4f4-4e53-8ddd-db0df89aca8a" containerName="nova-manage" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.451868 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d15434b3-f4f4-4e53-8ddd-db0df89aca8a" containerName="nova-manage" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.452155 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-log" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.452263 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d15434b3-f4f4-4e53-8ddd-db0df89aca8a" containerName="nova-manage" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.452350 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" containerName="nova-api-api" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.453565 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.456922 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.457299 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.462833 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.464259 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.610068 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.610136 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.610289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.610328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-config-data\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.610351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2d9b02-4247-444e-ba56-05d65493dd3e-logs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.610410 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rqk\" (UniqueName: \"kubernetes.io/projected/ae2d9b02-4247-444e-ba56-05d65493dd3e-kube-api-access-t5rqk\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.712353 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.712795 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-config-data\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.712963 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2d9b02-4247-444e-ba56-05d65493dd3e-logs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.713117 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rqk\" (UniqueName: \"kubernetes.io/projected/ae2d9b02-4247-444e-ba56-05d65493dd3e-kube-api-access-t5rqk\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.713199 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.713257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.713614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2d9b02-4247-444e-ba56-05d65493dd3e-logs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " 
pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.716103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-config-data\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.716403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.716688 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.717022 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.737754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rqk\" (UniqueName: \"kubernetes.io/projected/ae2d9b02-4247-444e-ba56-05d65493dd3e-kube-api-access-t5rqk\") pod \"nova-api-0\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: I1204 00:03:49.774166 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.854559 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.857688 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.861044 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 00:03:49 crc kubenswrapper[4764]: E1204 00:03:49.861091 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7dcd93b6-05f9-4027-ab36-b33da560ef64" containerName="nova-scheduler-scheduler" Dec 04 00:03:50 crc kubenswrapper[4764]: I1204 00:03:50.057961 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerStarted","Data":"9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5"} Dec 04 00:03:50 crc kubenswrapper[4764]: I1204 00:03:50.217528 4764 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:03:50 crc kubenswrapper[4764]: I1204 00:03:50.559587 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9" path="/var/lib/kubelet/pods/1179e07c-0222-4ae3-9bd2-81e2f5a6a6b9/volumes" Dec 04 00:03:51 crc kubenswrapper[4764]: I1204 00:03:51.085384 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae2d9b02-4247-444e-ba56-05d65493dd3e","Type":"ContainerStarted","Data":"32bfd6d3548d7ed75468d396b935e173001398b11d776ac46fbc7e65ee2ad928"} Dec 04 00:03:51 crc kubenswrapper[4764]: I1204 00:03:51.085432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae2d9b02-4247-444e-ba56-05d65493dd3e","Type":"ContainerStarted","Data":"7d9ad52eccd57f3dc2c2b526078b1c63f8db3caa34ad8712f276ff67f5b60e3c"} Dec 04 00:03:51 crc kubenswrapper[4764]: I1204 00:03:51.410412 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:47364->10.217.0.191:8775: read: connection reset by peer" Dec 04 00:03:51 crc kubenswrapper[4764]: I1204 00:03:51.410590 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:47368->10.217.0.191:8775: read: connection reset by peer" Dec 04 00:03:51 crc kubenswrapper[4764]: I1204 00:03:51.864622 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.056703 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-combined-ca-bundle\") pod \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.056841 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-nova-metadata-tls-certs\") pod \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.056971 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcnbw\" (UniqueName: \"kubernetes.io/projected/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-kube-api-access-qcnbw\") pod \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.057038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-logs\") pod \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.057176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-config-data\") pod \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\" (UID: \"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94\") " Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.057886 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-logs" (OuterVolumeSpecName: "logs") pod "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" (UID: "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.061567 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-kube-api-access-qcnbw" (OuterVolumeSpecName: "kube-api-access-qcnbw") pod "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" (UID: "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94"). InnerVolumeSpecName "kube-api-access-qcnbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.085347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" (UID: "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.087845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-config-data" (OuterVolumeSpecName: "config-data") pod "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" (UID: "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.094708 4764 generic.go:334] "Generic (PLEG): container finished" podID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerID="9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5" exitCode=0 Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.094749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerDied","Data":"9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5"} Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.096936 4764 generic.go:334] "Generic (PLEG): container finished" podID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerID="b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2" exitCode=0 Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.097048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94","Type":"ContainerDied","Data":"b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2"} Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.097093 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"982c595a-a3c1-4f3b-aca5-6e6e6cd52e94","Type":"ContainerDied","Data":"978060570c2e58cbad751dd4228a841bc1a84b6e0ec01b855e7576e4d353d667"} Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.097115 4764 scope.go:117] "RemoveContainer" containerID="b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.099645 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.100405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae2d9b02-4247-444e-ba56-05d65493dd3e","Type":"ContainerStarted","Data":"bfa789db8eedd550d660743c753b3bea2fab2bce89eb7947314062414fa5026a"} Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.118569 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" (UID: "982c595a-a3c1-4f3b-aca5-6e6e6cd52e94"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.159602 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.159633 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.159643 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.159652 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcnbw\" (UniqueName: \"kubernetes.io/projected/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-kube-api-access-qcnbw\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.159662 4764 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.175632 4764 scope.go:117] "RemoveContainer" containerID="16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.193933 4764 scope.go:117] "RemoveContainer" containerID="b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2" Dec 04 00:03:52 crc kubenswrapper[4764]: E1204 00:03:52.194517 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2\": container with ID starting with b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2 not found: ID does not exist" containerID="b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.194562 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2"} err="failed to get container status \"b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2\": rpc error: code = NotFound desc = could not find container \"b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2\": container with ID starting with b3b20e410a9be9119df11f8ce8bf4cc95b11daebb5f3aa7cb5fe8900332029a2 not found: ID does not exist" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.194587 4764 scope.go:117] "RemoveContainer" containerID="16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43" Dec 04 00:03:52 crc kubenswrapper[4764]: E1204 00:03:52.195005 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43\": container with ID starting with 16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43 not found: ID does not exist" containerID="16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.195045 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43"} err="failed to get container status \"16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43\": rpc error: code = NotFound desc = could not find container \"16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43\": container with ID starting with 16375c0eefbffa2ae41885957e230c4fe182ff5595e82e118ff168abe17a0e43 not found: ID does not exist" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.430747 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.430730477 podStartE2EDuration="3.430730477s" podCreationTimestamp="2025-12-04 00:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:52.1490446 +0000 UTC m=+1367.910369031" watchObservedRunningTime="2025-12-04 00:03:52.430730477 +0000 UTC m=+1368.192054898" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.437371 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.444812 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.456160 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:52 crc kubenswrapper[4764]: E1204 00:03:52.456526 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-log" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.456544 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-log" Dec 04 00:03:52 crc kubenswrapper[4764]: E1204 00:03:52.456569 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-metadata" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.456575 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-metadata" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.456778 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-metadata" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.456799 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" containerName="nova-metadata-log" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.457883 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.462205 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.462425 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.469179 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.563283 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982c595a-a3c1-4f3b-aca5-6e6e6cd52e94" path="/var/lib/kubelet/pods/982c595a-a3c1-4f3b-aca5-6e6e6cd52e94/volumes" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.566374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nn4h\" (UniqueName: \"kubernetes.io/projected/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-kube-api-access-8nn4h\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.566421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-config-data\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.566442 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc 
kubenswrapper[4764]: I1204 00:03:52.566628 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-logs\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.566739 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.668954 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nn4h\" (UniqueName: \"kubernetes.io/projected/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-kube-api-access-8nn4h\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.669019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-config-data\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.669049 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.669191 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-logs\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.669249 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.670070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-logs\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.674606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-config-data\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.675589 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.683318 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc 
kubenswrapper[4764]: I1204 00:03:52.696477 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nn4h\" (UniqueName: \"kubernetes.io/projected/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-kube-api-access-8nn4h\") pod \"nova-metadata-0\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " pod="openstack/nova-metadata-0" Dec 04 00:03:52 crc kubenswrapper[4764]: I1204 00:03:52.776539 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:03:53 crc kubenswrapper[4764]: I1204 00:03:53.316122 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:03:53 crc kubenswrapper[4764]: W1204 00:03:53.330135 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9218a48_75e0_47ae_a2ac_2d2fa4d08971.slice/crio-ea95e4f5c2b03dfe256bd4671694570f8edb1d8654aac8a540b13306cd682257 WatchSource:0}: Error finding container ea95e4f5c2b03dfe256bd4671694570f8edb1d8654aac8a540b13306cd682257: Status 404 returned error can't find the container with id ea95e4f5c2b03dfe256bd4671694570f8edb1d8654aac8a540b13306cd682257 Dec 04 00:03:53 crc kubenswrapper[4764]: I1204 00:03:53.957045 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.092781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-config-data\") pod \"7dcd93b6-05f9-4027-ab36-b33da560ef64\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.093102 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-combined-ca-bundle\") pod \"7dcd93b6-05f9-4027-ab36-b33da560ef64\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.093169 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whpjb\" (UniqueName: \"kubernetes.io/projected/7dcd93b6-05f9-4027-ab36-b33da560ef64-kube-api-access-whpjb\") pod \"7dcd93b6-05f9-4027-ab36-b33da560ef64\" (UID: \"7dcd93b6-05f9-4027-ab36-b33da560ef64\") " Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.097271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcd93b6-05f9-4027-ab36-b33da560ef64-kube-api-access-whpjb" (OuterVolumeSpecName: "kube-api-access-whpjb") pod "7dcd93b6-05f9-4027-ab36-b33da560ef64" (UID: "7dcd93b6-05f9-4027-ab36-b33da560ef64"). InnerVolumeSpecName "kube-api-access-whpjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.121802 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dcd93b6-05f9-4027-ab36-b33da560ef64" (UID: "7dcd93b6-05f9-4027-ab36-b33da560ef64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.125142 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerStarted","Data":"857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc"} Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.126282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-config-data" (OuterVolumeSpecName: "config-data") pod "7dcd93b6-05f9-4027-ab36-b33da560ef64" (UID: "7dcd93b6-05f9-4027-ab36-b33da560ef64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.127048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9218a48-75e0-47ae-a2ac-2d2fa4d08971","Type":"ContainerStarted","Data":"6f3ad0a68a4b98fc593b43b878fbce89575024644243ce672d381c81a0dabf6a"} Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.127108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9218a48-75e0-47ae-a2ac-2d2fa4d08971","Type":"ContainerStarted","Data":"506954a3df1b596b6cc009eafe6c0378475f7b74778bfc77c7cd68e0dfd9aa9d"} Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.127133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9218a48-75e0-47ae-a2ac-2d2fa4d08971","Type":"ContainerStarted","Data":"ea95e4f5c2b03dfe256bd4671694570f8edb1d8654aac8a540b13306cd682257"} Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.131871 4764 generic.go:334] "Generic (PLEG): container finished" podID="7dcd93b6-05f9-4027-ab36-b33da560ef64" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" exitCode=0 Dec 04 00:03:54 crc 
kubenswrapper[4764]: I1204 00:03:54.131910 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dcd93b6-05f9-4027-ab36-b33da560ef64","Type":"ContainerDied","Data":"f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31"} Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.131931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dcd93b6-05f9-4027-ab36-b33da560ef64","Type":"ContainerDied","Data":"9b7b63bcea5389e4c9fc4a702e9dac1631f26cea7fb0ea8d1c75aa12ddcf7fa9"} Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.131947 4764 scope.go:117] "RemoveContainer" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.132015 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.155334 4764 scope.go:117] "RemoveContainer" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" Dec 04 00:03:54 crc kubenswrapper[4764]: E1204 00:03:54.155663 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31\": container with ID starting with f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31 not found: ID does not exist" containerID="f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.155689 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31"} err="failed to get container status \"f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31\": rpc error: code = NotFound desc = could not find container 
\"f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31\": container with ID starting with f323519e790c070936bdfc69cbcff97526bd59c662bf19a559248794a8dcdc31 not found: ID does not exist" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.179497 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hzf8s" podStartSLOduration=2.995166665 podStartE2EDuration="7.179479685s" podCreationTimestamp="2025-12-04 00:03:47 +0000 UTC" firstStartedPulling="2025-12-04 00:03:49.037766707 +0000 UTC m=+1364.799091118" lastFinishedPulling="2025-12-04 00:03:53.222079737 +0000 UTC m=+1368.983404138" observedRunningTime="2025-12-04 00:03:54.15407184 +0000 UTC m=+1369.915396261" watchObservedRunningTime="2025-12-04 00:03:54.179479685 +0000 UTC m=+1369.940804096" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.184965 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.18494874 podStartE2EDuration="2.18494874s" podCreationTimestamp="2025-12-04 00:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:03:54.171738955 +0000 UTC m=+1369.933063366" watchObservedRunningTime="2025-12-04 00:03:54.18494874 +0000 UTC m=+1369.946273151" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.195013 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.195046 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whpjb\" (UniqueName: \"kubernetes.io/projected/7dcd93b6-05f9-4027-ab36-b33da560ef64-kube-api-access-whpjb\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.195058 
4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcd93b6-05f9-4027-ab36-b33da560ef64-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.203895 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.230335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.237654 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:54 crc kubenswrapper[4764]: E1204 00:03:54.238187 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcd93b6-05f9-4027-ab36-b33da560ef64" containerName="nova-scheduler-scheduler" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.238210 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcd93b6-05f9-4027-ab36-b33da560ef64" containerName="nova-scheduler-scheduler" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.238469 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcd93b6-05f9-4027-ab36-b33da560ef64" containerName="nova-scheduler-scheduler" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.239278 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.243418 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.247220 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.398411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzv2n\" (UniqueName: \"kubernetes.io/projected/0baff485-3721-45b5-9177-96c30ce03251-kube-api-access-rzv2n\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.398528 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-config-data\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.398561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.499821 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzv2n\" (UniqueName: \"kubernetes.io/projected/0baff485-3721-45b5-9177-96c30ce03251-kube-api-access-rzv2n\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.499912 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-config-data\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.499936 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.504810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-config-data\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.505276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.527621 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzv2n\" (UniqueName: \"kubernetes.io/projected/0baff485-3721-45b5-9177-96c30ce03251-kube-api-access-rzv2n\") pod \"nova-scheduler-0\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.557042 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:03:54 crc kubenswrapper[4764]: I1204 00:03:54.562232 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcd93b6-05f9-4027-ab36-b33da560ef64" path="/var/lib/kubelet/pods/7dcd93b6-05f9-4027-ab36-b33da560ef64/volumes" Dec 04 00:03:55 crc kubenswrapper[4764]: W1204 00:03:55.044938 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0baff485_3721_45b5_9177_96c30ce03251.slice/crio-c980cd93b501010ed400c99974b61793e3928cbcc6ffaebbe39267796eced2ae WatchSource:0}: Error finding container c980cd93b501010ed400c99974b61793e3928cbcc6ffaebbe39267796eced2ae: Status 404 returned error can't find the container with id c980cd93b501010ed400c99974b61793e3928cbcc6ffaebbe39267796eced2ae Dec 04 00:03:55 crc kubenswrapper[4764]: I1204 00:03:55.050367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:03:55 crc kubenswrapper[4764]: I1204 00:03:55.149454 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0baff485-3721-45b5-9177-96c30ce03251","Type":"ContainerStarted","Data":"c980cd93b501010ed400c99974b61793e3928cbcc6ffaebbe39267796eced2ae"} Dec 04 00:03:56 crc kubenswrapper[4764]: I1204 00:03:56.163968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0baff485-3721-45b5-9177-96c30ce03251","Type":"ContainerStarted","Data":"c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab"} Dec 04 00:03:56 crc kubenswrapper[4764]: I1204 00:03:56.191664 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.191642839 podStartE2EDuration="2.191642839s" podCreationTimestamp="2025-12-04 00:03:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 00:03:56.180597387 +0000 UTC m=+1371.941921798" watchObservedRunningTime="2025-12-04 00:03:56.191642839 +0000 UTC m=+1371.952967250" Dec 04 00:03:57 crc kubenswrapper[4764]: I1204 00:03:57.750497 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:57 crc kubenswrapper[4764]: I1204 00:03:57.750862 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:03:57 crc kubenswrapper[4764]: I1204 00:03:57.777845 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 00:03:57 crc kubenswrapper[4764]: I1204 00:03:57.777892 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 00:03:58 crc kubenswrapper[4764]: I1204 00:03:58.797913 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hzf8s" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="registry-server" probeResult="failure" output=< Dec 04 00:03:58 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 00:03:58 crc kubenswrapper[4764]: > Dec 04 00:03:59 crc kubenswrapper[4764]: I1204 00:03:59.557404 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 00:03:59 crc kubenswrapper[4764]: I1204 00:03:59.777343 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 00:03:59 crc kubenswrapper[4764]: I1204 00:03:59.777431 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 00:04:00 crc kubenswrapper[4764]: I1204 00:04:00.790197 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-log" 
probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 00:04:00 crc kubenswrapper[4764]: I1204 00:04:00.790230 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 00:04:02 crc kubenswrapper[4764]: I1204 00:04:02.778489 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 00:04:02 crc kubenswrapper[4764]: I1204 00:04:02.779747 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 00:04:03 crc kubenswrapper[4764]: I1204 00:04:03.793878 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 00:04:03 crc kubenswrapper[4764]: I1204 00:04:03.793959 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 00:04:04 crc kubenswrapper[4764]: I1204 00:04:04.576563 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 00:04:04 crc kubenswrapper[4764]: I1204 00:04:04.601020 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 
00:04:05.290648 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.533690 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjgz"] Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.536587 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.553731 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjgz"] Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.735171 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmf6s\" (UniqueName: \"kubernetes.io/projected/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-kube-api-access-bmf6s\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.735249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-catalog-content\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.735272 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-utilities\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.740966 4764 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-ngb25"] Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.767359 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.776056 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngb25"] Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.837180 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmf6s\" (UniqueName: \"kubernetes.io/projected/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-kube-api-access-bmf6s\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.837239 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-catalog-content\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.837262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-utilities\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.837735 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-catalog-content\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " 
pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.837800 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-utilities\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.866772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmf6s\" (UniqueName: \"kubernetes.io/projected/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-kube-api-access-bmf6s\") pod \"redhat-marketplace-gzjgz\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.939181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hkxn\" (UniqueName: \"kubernetes.io/projected/fc8a5462-d9d8-442b-aa56-389f2706fef0-kube-api-access-4hkxn\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.939289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-catalog-content\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:05 crc kubenswrapper[4764]: I1204 00:04:05.939364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-utilities\") pod \"certified-operators-ngb25\" (UID: 
\"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.040692 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hkxn\" (UniqueName: \"kubernetes.io/projected/fc8a5462-d9d8-442b-aa56-389f2706fef0-kube-api-access-4hkxn\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.040794 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-catalog-content\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.040857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-utilities\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.041380 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-utilities\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.041942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-catalog-content\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") 
" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.061529 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hkxn\" (UniqueName: \"kubernetes.io/projected/fc8a5462-d9d8-442b-aa56-389f2706fef0-kube-api-access-4hkxn\") pod \"certified-operators-ngb25\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.099224 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.155137 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.573483 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngb25"] Dec 04 00:04:06 crc kubenswrapper[4764]: I1204 00:04:06.909775 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjgz"] Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.305956 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerID="244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc" exitCode=0 Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.306041 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjgz" event={"ID":"7ee6bb73-8de2-47ac-a53c-d61f41701f9b","Type":"ContainerDied","Data":"244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc"} Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.306135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjgz" 
event={"ID":"7ee6bb73-8de2-47ac-a53c-d61f41701f9b","Type":"ContainerStarted","Data":"f15287de1f990fae9821bae0f7ccaff2fcdb6190b0fc7d7226366aa9faf2ad63"} Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.308701 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.309951 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerID="49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff" exitCode=0 Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.309992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerDied","Data":"49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff"} Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.310023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerStarted","Data":"fa148d66babb15b5b9af3108197565e0f3eb3f327746eb19b93645cf004dfe0f"} Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.819366 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:04:07 crc kubenswrapper[4764]: I1204 00:04:07.882333 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:04:08 crc kubenswrapper[4764]: I1204 00:04:08.333905 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 00:04:08 crc kubenswrapper[4764]: I1204 00:04:08.935687 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g4pnw"] Dec 04 00:04:08 crc kubenswrapper[4764]: I1204 00:04:08.938215 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:08 crc kubenswrapper[4764]: I1204 00:04:08.946035 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g4pnw"] Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.104794 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-catalog-content\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.104888 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrw5s\" (UniqueName: \"kubernetes.io/projected/c545243b-63f7-4b3b-a18b-f7053cb83d93-kube-api-access-rrw5s\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.104983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-utilities\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.206346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-catalog-content\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 
00:04:09.206451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrw5s\" (UniqueName: \"kubernetes.io/projected/c545243b-63f7-4b3b-a18b-f7053cb83d93-kube-api-access-rrw5s\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.206527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-utilities\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.207283 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-catalog-content\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.207334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-utilities\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.229057 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrw5s\" (UniqueName: \"kubernetes.io/projected/c545243b-63f7-4b3b-a18b-f7053cb83d93-kube-api-access-rrw5s\") pod \"community-operators-g4pnw\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.285808 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.390665 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerStarted","Data":"644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37"} Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.406459 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerID="71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd" exitCode=0 Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.406512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjgz" event={"ID":"7ee6bb73-8de2-47ac-a53c-d61f41701f9b","Type":"ContainerDied","Data":"71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd"} Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.780949 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.781840 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.785872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.788162 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 00:04:09 crc kubenswrapper[4764]: I1204 00:04:09.896594 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g4pnw"] Dec 04 00:04:10 crc kubenswrapper[4764]: I1204 00:04:10.430382 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc8a5462-d9d8-442b-aa56-389f2706fef0" 
containerID="644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37" exitCode=0 Dec 04 00:04:10 crc kubenswrapper[4764]: I1204 00:04:10.430484 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerDied","Data":"644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37"} Dec 04 00:04:10 crc kubenswrapper[4764]: I1204 00:04:10.432742 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerStarted","Data":"6b7c1340bdec4437e525238c88bb7592f7714cd9c85155dc02280926ec3018f8"} Dec 04 00:04:10 crc kubenswrapper[4764]: I1204 00:04:10.433074 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 00:04:10 crc kubenswrapper[4764]: I1204 00:04:10.446313 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.444978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerStarted","Data":"0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70"} Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.446734 4764 generic.go:334] "Generic (PLEG): container finished" podID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerID="a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b" exitCode=0 Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.446812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerDied","Data":"a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b"} Dec 04 00:04:11 crc kubenswrapper[4764]: 
I1204 00:04:11.449826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjgz" event={"ID":"7ee6bb73-8de2-47ac-a53c-d61f41701f9b","Type":"ContainerStarted","Data":"d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1"} Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.477481 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngb25" podStartSLOduration=2.971646243 podStartE2EDuration="6.477454023s" podCreationTimestamp="2025-12-04 00:04:05 +0000 UTC" firstStartedPulling="2025-12-04 00:04:07.311522575 +0000 UTC m=+1383.072847006" lastFinishedPulling="2025-12-04 00:04:10.817330375 +0000 UTC m=+1386.578654786" observedRunningTime="2025-12-04 00:04:11.473152367 +0000 UTC m=+1387.234476778" watchObservedRunningTime="2025-12-04 00:04:11.477454023 +0000 UTC m=+1387.238778434" Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.530522 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzjgz" podStartSLOduration=3.882672809 podStartE2EDuration="6.530470428s" podCreationTimestamp="2025-12-04 00:04:05 +0000 UTC" firstStartedPulling="2025-12-04 00:04:07.308449079 +0000 UTC m=+1383.069773500" lastFinishedPulling="2025-12-04 00:04:09.956246708 +0000 UTC m=+1385.717571119" observedRunningTime="2025-12-04 00:04:11.529614397 +0000 UTC m=+1387.290938808" watchObservedRunningTime="2025-12-04 00:04:11.530470428 +0000 UTC m=+1387.291794839" Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.928180 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hzf8s"] Dec 04 00:04:11 crc kubenswrapper[4764]: I1204 00:04:11.928420 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hzf8s" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="registry-server" 
containerID="cri-o://857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc" gracePeriod=2 Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.434021 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.460366 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerStarted","Data":"51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc"} Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.474485 4764 generic.go:334] "Generic (PLEG): container finished" podID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerID="857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc" exitCode=0 Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.474782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerDied","Data":"857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc"} Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.474828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hzf8s" event={"ID":"717de319-f1ef-4dd5-8e96-3b7087f6a83d","Type":"ContainerDied","Data":"eb29401c9fb8b1af97fddf3fd54dc66154bbdb8a2209bdf88a2d83b6a1a177ce"} Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.474847 4764 scope.go:117] "RemoveContainer" containerID="857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.475015 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hzf8s" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.496904 4764 scope.go:117] "RemoveContainer" containerID="9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.520169 4764 scope.go:117] "RemoveContainer" containerID="c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.554074 4764 scope.go:117] "RemoveContainer" containerID="857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc" Dec 04 00:04:12 crc kubenswrapper[4764]: E1204 00:04:12.554376 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc\": container with ID starting with 857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc not found: ID does not exist" containerID="857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.554407 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc"} err="failed to get container status \"857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc\": rpc error: code = NotFound desc = could not find container \"857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc\": container with ID starting with 857680400cdacf7f26c45d04c968f98f000d039498ce9803b09c7929d775aebc not found: ID does not exist" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.554429 4764 scope.go:117] "RemoveContainer" containerID="9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5" Dec 04 00:04:12 crc kubenswrapper[4764]: E1204 00:04:12.554657 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5\": container with ID starting with 9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5 not found: ID does not exist" containerID="9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.554677 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5"} err="failed to get container status \"9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5\": rpc error: code = NotFound desc = could not find container \"9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5\": container with ID starting with 9bd7eb8dbe5de7c17d702e2a258cf4bd81221ab4dbb38e93f267a98c897ba5a5 not found: ID does not exist" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.554690 4764 scope.go:117] "RemoveContainer" containerID="c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10" Dec 04 00:04:12 crc kubenswrapper[4764]: E1204 00:04:12.554882 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10\": container with ID starting with c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10 not found: ID does not exist" containerID="c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.554903 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10"} err="failed to get container status \"c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10\": rpc error: code = NotFound desc = could not find container 
\"c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10\": container with ID starting with c7012526565855b7c6b3e35641d821df5010959defff2e15ce9d30abafa02a10 not found: ID does not exist" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.570770 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b57n9\" (UniqueName: \"kubernetes.io/projected/717de319-f1ef-4dd5-8e96-3b7087f6a83d-kube-api-access-b57n9\") pod \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.570853 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-utilities\") pod \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.570948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-catalog-content\") pod \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\" (UID: \"717de319-f1ef-4dd5-8e96-3b7087f6a83d\") " Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.572619 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-utilities" (OuterVolumeSpecName: "utilities") pod "717de319-f1ef-4dd5-8e96-3b7087f6a83d" (UID: "717de319-f1ef-4dd5-8e96-3b7087f6a83d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.576510 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717de319-f1ef-4dd5-8e96-3b7087f6a83d-kube-api-access-b57n9" (OuterVolumeSpecName: "kube-api-access-b57n9") pod "717de319-f1ef-4dd5-8e96-3b7087f6a83d" (UID: "717de319-f1ef-4dd5-8e96-3b7087f6a83d"). InnerVolumeSpecName "kube-api-access-b57n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.669177 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "717de319-f1ef-4dd5-8e96-3b7087f6a83d" (UID: "717de319-f1ef-4dd5-8e96-3b7087f6a83d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.673549 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.673705 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b57n9\" (UniqueName: \"kubernetes.io/projected/717de319-f1ef-4dd5-8e96-3b7087f6a83d-kube-api-access-b57n9\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.673845 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/717de319-f1ef-4dd5-8e96-3b7087f6a83d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.785746 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 
00:04:12.786424 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.794861 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.837365 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hzf8s"] Dec 04 00:04:12 crc kubenswrapper[4764]: I1204 00:04:12.852276 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hzf8s"] Dec 04 00:04:13 crc kubenswrapper[4764]: I1204 00:04:13.487734 4764 generic.go:334] "Generic (PLEG): container finished" podID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerID="51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc" exitCode=0 Dec 04 00:04:13 crc kubenswrapper[4764]: I1204 00:04:13.487791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerDied","Data":"51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc"} Dec 04 00:04:13 crc kubenswrapper[4764]: I1204 00:04:13.501839 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 00:04:14 crc kubenswrapper[4764]: I1204 00:04:14.508443 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerStarted","Data":"ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f"} Dec 04 00:04:14 crc kubenswrapper[4764]: I1204 00:04:14.534552 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g4pnw" podStartSLOduration=4.03449598 podStartE2EDuration="6.53453481s" podCreationTimestamp="2025-12-04 00:04:08 
+0000 UTC" firstStartedPulling="2025-12-04 00:04:11.448008287 +0000 UTC m=+1387.209332698" lastFinishedPulling="2025-12-04 00:04:13.948047117 +0000 UTC m=+1389.709371528" observedRunningTime="2025-12-04 00:04:14.529385154 +0000 UTC m=+1390.290709565" watchObservedRunningTime="2025-12-04 00:04:14.53453481 +0000 UTC m=+1390.295859221" Dec 04 00:04:14 crc kubenswrapper[4764]: I1204 00:04:14.588176 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" path="/var/lib/kubelet/pods/717de319-f1ef-4dd5-8e96-3b7087f6a83d/volumes" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.099358 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.099701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.155882 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.156563 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.162073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.207831 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.582802 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:16 crc kubenswrapper[4764]: I1204 00:04:16.592932 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:18 crc kubenswrapper[4764]: I1204 00:04:18.524127 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngb25"] Dec 04 00:04:18 crc kubenswrapper[4764]: I1204 00:04:18.546424 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngb25" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="registry-server" containerID="cri-o://0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70" gracePeriod=2 Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.036315 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.131181 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjgz"] Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.190974 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-catalog-content\") pod \"fc8a5462-d9d8-442b-aa56-389f2706fef0\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.191025 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hkxn\" (UniqueName: \"kubernetes.io/projected/fc8a5462-d9d8-442b-aa56-389f2706fef0-kube-api-access-4hkxn\") pod \"fc8a5462-d9d8-442b-aa56-389f2706fef0\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.191110 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-utilities\") pod 
\"fc8a5462-d9d8-442b-aa56-389f2706fef0\" (UID: \"fc8a5462-d9d8-442b-aa56-389f2706fef0\") " Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.192333 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-utilities" (OuterVolumeSpecName: "utilities") pod "fc8a5462-d9d8-442b-aa56-389f2706fef0" (UID: "fc8a5462-d9d8-442b-aa56-389f2706fef0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.199456 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8a5462-d9d8-442b-aa56-389f2706fef0-kube-api-access-4hkxn" (OuterVolumeSpecName: "kube-api-access-4hkxn") pod "fc8a5462-d9d8-442b-aa56-389f2706fef0" (UID: "fc8a5462-d9d8-442b-aa56-389f2706fef0"). InnerVolumeSpecName "kube-api-access-4hkxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.243933 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc8a5462-d9d8-442b-aa56-389f2706fef0" (UID: "fc8a5462-d9d8-442b-aa56-389f2706fef0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.286447 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.286752 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.292833 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.292878 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hkxn\" (UniqueName: \"kubernetes.io/projected/fc8a5462-d9d8-442b-aa56-389f2706fef0-kube-api-access-4hkxn\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.292893 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc8a5462-d9d8-442b-aa56-389f2706fef0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.353254 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.560085 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerID="0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70" exitCode=0 Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.560198 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" 
event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerDied","Data":"0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70"} Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.560302 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngb25" event={"ID":"fc8a5462-d9d8-442b-aa56-389f2706fef0","Type":"ContainerDied","Data":"fa148d66babb15b5b9af3108197565e0f3eb3f327746eb19b93645cf004dfe0f"} Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.560237 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngb25" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.560373 4764 scope.go:117] "RemoveContainer" containerID="0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.561169 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzjgz" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="registry-server" containerID="cri-o://d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1" gracePeriod=2 Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.592583 4764 scope.go:117] "RemoveContainer" containerID="644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.624149 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngb25"] Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.631300 4764 scope.go:117] "RemoveContainer" containerID="49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.633749 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngb25"] Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.639783 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.724046 4764 scope.go:117] "RemoveContainer" containerID="0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70" Dec 04 00:04:19 crc kubenswrapper[4764]: E1204 00:04:19.724601 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70\": container with ID starting with 0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70 not found: ID does not exist" containerID="0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.724649 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70"} err="failed to get container status \"0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70\": rpc error: code = NotFound desc = could not find container \"0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70\": container with ID starting with 0d115b1d08967ebf55f5bbbb1404ef5a90079d384ecb81d2e50d5c4c06ae6c70 not found: ID does not exist" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.724680 4764 scope.go:117] "RemoveContainer" containerID="644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37" Dec 04 00:04:19 crc kubenswrapper[4764]: E1204 00:04:19.725223 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37\": container with ID starting with 644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37 not found: ID does not exist" 
containerID="644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.725251 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37"} err="failed to get container status \"644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37\": rpc error: code = NotFound desc = could not find container \"644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37\": container with ID starting with 644a702bdba180618ff0e7b649944db7ab7b6a5266dfa30d718bead0b41d3d37 not found: ID does not exist" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.725270 4764 scope.go:117] "RemoveContainer" containerID="49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff" Dec 04 00:04:19 crc kubenswrapper[4764]: E1204 00:04:19.725591 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff\": container with ID starting with 49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff not found: ID does not exist" containerID="49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff" Dec 04 00:04:19 crc kubenswrapper[4764]: I1204 00:04:19.725619 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff"} err="failed to get container status \"49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff\": rpc error: code = NotFound desc = could not find container \"49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff\": container with ID starting with 49cec00011873ac31c9aefda55a09161c423265dc7c24c2fa18eb1336632ecff not found: ID does not exist" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.054951 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.212831 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-catalog-content\") pod \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.213178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmf6s\" (UniqueName: \"kubernetes.io/projected/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-kube-api-access-bmf6s\") pod \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.213348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-utilities\") pod \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\" (UID: \"7ee6bb73-8de2-47ac-a53c-d61f41701f9b\") " Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.213893 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-utilities" (OuterVolumeSpecName: "utilities") pod "7ee6bb73-8de2-47ac-a53c-d61f41701f9b" (UID: "7ee6bb73-8de2-47ac-a53c-d61f41701f9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.218233 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-kube-api-access-bmf6s" (OuterVolumeSpecName: "kube-api-access-bmf6s") pod "7ee6bb73-8de2-47ac-a53c-d61f41701f9b" (UID: "7ee6bb73-8de2-47ac-a53c-d61f41701f9b"). 
InnerVolumeSpecName "kube-api-access-bmf6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.252542 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7ee6bb73-8de2-47ac-a53c-d61f41701f9b" (UID: "7ee6bb73-8de2-47ac-a53c-d61f41701f9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.316575 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.316640 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmf6s\" (UniqueName: \"kubernetes.io/projected/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-kube-api-access-bmf6s\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.316671 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ee6bb73-8de2-47ac-a53c-d61f41701f9b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.563197 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" path="/var/lib/kubelet/pods/fc8a5462-d9d8-442b-aa56-389f2706fef0/volumes" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.573320 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerID="d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1" exitCode=0 Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.573390 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-gzjgz" event={"ID":"7ee6bb73-8de2-47ac-a53c-d61f41701f9b","Type":"ContainerDied","Data":"d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1"} Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.573427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzjgz" event={"ID":"7ee6bb73-8de2-47ac-a53c-d61f41701f9b","Type":"ContainerDied","Data":"f15287de1f990fae9821bae0f7ccaff2fcdb6190b0fc7d7226366aa9faf2ad63"} Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.573457 4764 scope.go:117] "RemoveContainer" containerID="d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.573597 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzjgz" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.602402 4764 scope.go:117] "RemoveContainer" containerID="71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.605993 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjgz"] Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.621310 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzjgz"] Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.625762 4764 scope.go:117] "RemoveContainer" containerID="244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.656191 4764 scope.go:117] "RemoveContainer" containerID="d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1" Dec 04 00:04:20 crc kubenswrapper[4764]: E1204 00:04:20.656854 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1\": container with ID starting with d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1 not found: ID does not exist" containerID="d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.656947 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1"} err="failed to get container status \"d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1\": rpc error: code = NotFound desc = could not find container \"d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1\": container with ID starting with d1f728af5e2a66b607258383de904b5e8121e4fad83a26db3cb89d0d11a0dfa1 not found: ID does not exist" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.656986 4764 scope.go:117] "RemoveContainer" containerID="71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd" Dec 04 00:04:20 crc kubenswrapper[4764]: E1204 00:04:20.657464 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd\": container with ID starting with 71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd not found: ID does not exist" containerID="71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.657502 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd"} err="failed to get container status \"71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd\": rpc error: code = NotFound desc = could not find container \"71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd\": container with ID 
starting with 71c681c21666ced10f9a12858665404eb990b9065f5a51812a8ca9ddb697f3fd not found: ID does not exist" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.657531 4764 scope.go:117] "RemoveContainer" containerID="244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc" Dec 04 00:04:20 crc kubenswrapper[4764]: E1204 00:04:20.657788 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc\": container with ID starting with 244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc not found: ID does not exist" containerID="244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc" Dec 04 00:04:20 crc kubenswrapper[4764]: I1204 00:04:20.657895 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc"} err="failed to get container status \"244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc\": rpc error: code = NotFound desc = could not find container \"244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc\": container with ID starting with 244426f0668055d916a6cb2234c9b783f7eb10efaab56320fa377455e58caabc not found: ID does not exist" Dec 04 00:04:22 crc kubenswrapper[4764]: I1204 00:04:22.560757 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" path="/var/lib/kubelet/pods/7ee6bb73-8de2-47ac-a53c-d61f41701f9b/volumes" Dec 04 00:04:23 crc kubenswrapper[4764]: I1204 00:04:23.525886 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g4pnw"] Dec 04 00:04:23 crc kubenswrapper[4764]: I1204 00:04:23.526112 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g4pnw" 
podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="registry-server" containerID="cri-o://ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f" gracePeriod=2 Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.003880 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.196155 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-catalog-content\") pod \"c545243b-63f7-4b3b-a18b-f7053cb83d93\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.196515 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrw5s\" (UniqueName: \"kubernetes.io/projected/c545243b-63f7-4b3b-a18b-f7053cb83d93-kube-api-access-rrw5s\") pod \"c545243b-63f7-4b3b-a18b-f7053cb83d93\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.196634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-utilities\") pod \"c545243b-63f7-4b3b-a18b-f7053cb83d93\" (UID: \"c545243b-63f7-4b3b-a18b-f7053cb83d93\") " Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.197986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-utilities" (OuterVolumeSpecName: "utilities") pod "c545243b-63f7-4b3b-a18b-f7053cb83d93" (UID: "c545243b-63f7-4b3b-a18b-f7053cb83d93"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.204818 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c545243b-63f7-4b3b-a18b-f7053cb83d93-kube-api-access-rrw5s" (OuterVolumeSpecName: "kube-api-access-rrw5s") pod "c545243b-63f7-4b3b-a18b-f7053cb83d93" (UID: "c545243b-63f7-4b3b-a18b-f7053cb83d93"). InnerVolumeSpecName "kube-api-access-rrw5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.249904 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c545243b-63f7-4b3b-a18b-f7053cb83d93" (UID: "c545243b-63f7-4b3b-a18b-f7053cb83d93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.299230 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.299493 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrw5s\" (UniqueName: \"kubernetes.io/projected/c545243b-63f7-4b3b-a18b-f7053cb83d93-kube-api-access-rrw5s\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.299592 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c545243b-63f7-4b3b-a18b-f7053cb83d93-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.627834 4764 generic.go:334] "Generic (PLEG): container finished" podID="c545243b-63f7-4b3b-a18b-f7053cb83d93" 
containerID="ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f" exitCode=0 Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.627883 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerDied","Data":"ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f"} Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.627922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g4pnw" event={"ID":"c545243b-63f7-4b3b-a18b-f7053cb83d93","Type":"ContainerDied","Data":"6b7c1340bdec4437e525238c88bb7592f7714cd9c85155dc02280926ec3018f8"} Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.627944 4764 scope.go:117] "RemoveContainer" containerID="ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.628028 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g4pnw" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.656079 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g4pnw"] Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.660352 4764 scope.go:117] "RemoveContainer" containerID="51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.664858 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g4pnw"] Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.699584 4764 scope.go:117] "RemoveContainer" containerID="a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.750750 4764 scope.go:117] "RemoveContainer" containerID="ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f" Dec 04 00:04:24 crc kubenswrapper[4764]: E1204 00:04:24.751192 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f\": container with ID starting with ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f not found: ID does not exist" containerID="ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.751226 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f"} err="failed to get container status \"ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f\": rpc error: code = NotFound desc = could not find container \"ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f\": container with ID starting with ff44a5675142e3a8ccb4b8e31ed88cbd0d4347a6fe9bbaea7d57da45259d492f not 
found: ID does not exist" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.751257 4764 scope.go:117] "RemoveContainer" containerID="51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc" Dec 04 00:04:24 crc kubenswrapper[4764]: E1204 00:04:24.751570 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc\": container with ID starting with 51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc not found: ID does not exist" containerID="51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.751595 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc"} err="failed to get container status \"51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc\": rpc error: code = NotFound desc = could not find container \"51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc\": container with ID starting with 51220f9979acfbff4d111482513a266e02cdf44c0f1ac986290913ffe157bfcc not found: ID does not exist" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.751607 4764 scope.go:117] "RemoveContainer" containerID="a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b" Dec 04 00:04:24 crc kubenswrapper[4764]: E1204 00:04:24.751917 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b\": container with ID starting with a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b not found: ID does not exist" containerID="a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b" Dec 04 00:04:24 crc kubenswrapper[4764]: I1204 00:04:24.751958 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b"} err="failed to get container status \"a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b\": rpc error: code = NotFound desc = could not find container \"a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b\": container with ID starting with a8f5d6abff0d4da82feeb8ee393ee64cc48aedf633bdaaf34a9f24ec5437203b not found: ID does not exist" Dec 04 00:04:26 crc kubenswrapper[4764]: I1204 00:04:26.561477 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" path="/var/lib/kubelet/pods/c545243b-63f7-4b3b-a18b-f7053cb83d93/volumes" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.119357 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.120090 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" containerName="openstackclient" containerID="cri-o://563686178a29a5fa1144486490dc7517a511b76ab0bc8be6b06e40785fc8ba8a" gracePeriod=2 Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.206775 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.289560 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.432779 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.432839 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data 
podName:bda43f61-31ae-4c4c-967e-f0e8d13f5ae9 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:34.932823106 +0000 UTC m=+1410.694147517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data") pod "rabbitmq-cell1-server-0" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9") : configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467203 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance6ca1-account-delete-t8t45"] Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467585 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467602 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467619 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" containerName="openstackclient" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467626 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" containerName="openstackclient" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467634 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467640 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467649 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" 
containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467656 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467665 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467670 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467681 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467687 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467696 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467704 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467711 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467737 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467753 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" 
containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467759 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467768 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467775 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467787 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467792 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="extract-utilities" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467810 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467817 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.467831 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.467839 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" containerName="extract-content" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.468003 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a5462-d9d8-442b-aa56-389f2706fef0" 
containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.468018 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="717de319-f1ef-4dd5-8e96-3b7087f6a83d" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.468028 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee6bb73-8de2-47ac-a53c-d61f41701f9b" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.468042 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" containerName="openstackclient" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.468055 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c545243b-63f7-4b3b-a18b-f7053cb83d93" containerName="registry-server" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.468740 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.482128 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance6ca1-account-delete-t8t45"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.585294 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder2a5d-account-delete-tbzqq"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.586357 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.586477 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.587266 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="openstack-network-exporter" containerID="cri-o://92cc9452c05ea4b28722f96dbdaea7af0d12e4f63960239309ced2df5f5e288d" gracePeriod=300 Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.594854 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.595130 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="ovn-northd" containerID="cri-o://bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" gracePeriod=30 Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.595260 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="openstack-network-exporter" containerID="cri-o://60ba5da4792a87b1525f0ffb79e190a4fc0bc794c1d89f435442a97dc33fe1b1" gracePeriod=30 Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.613941 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder2a5d-account-delete-tbzqq"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.645240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa27bce9-febf-497d-ad48-21b087064f34-operator-scripts\") pod \"glance6ca1-account-delete-t8t45\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.645479 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfzxz\" (UniqueName: \"kubernetes.io/projected/aa27bce9-febf-497d-ad48-21b087064f34-kube-api-access-wfzxz\") pod \"glance6ca1-account-delete-t8t45\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.710733 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-qwhzb"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.722126 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-qwhzb"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.749118 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfzxz\" (UniqueName: \"kubernetes.io/projected/aa27bce9-febf-497d-ad48-21b087064f34-kube-api-access-wfzxz\") pod \"glance6ca1-account-delete-t8t45\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.749336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047e426c-4178-43ce-8a09-ff5b4a6a13f1-operator-scripts\") pod \"cinder2a5d-account-delete-tbzqq\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.749366 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa27bce9-febf-497d-ad48-21b087064f34-operator-scripts\") pod \"glance6ca1-account-delete-t8t45\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.749414 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j58pv\" (UniqueName: \"kubernetes.io/projected/047e426c-4178-43ce-8a09-ff5b4a6a13f1-kube-api-access-j58pv\") pod \"cinder2a5d-account-delete-tbzqq\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.751180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa27bce9-febf-497d-ad48-21b087064f34-operator-scripts\") pod \"glance6ca1-account-delete-t8t45\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.841793 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-rsw7c"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.851188 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047e426c-4178-43ce-8a09-ff5b4a6a13f1-operator-scripts\") pod \"cinder2a5d-account-delete-tbzqq\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.851254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j58pv\" (UniqueName: \"kubernetes.io/projected/047e426c-4178-43ce-8a09-ff5b4a6a13f1-kube-api-access-j58pv\") pod \"cinder2a5d-account-delete-tbzqq\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.852256 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047e426c-4178-43ce-8a09-ff5b4a6a13f1-operator-scripts\") pod \"cinder2a5d-account-delete-tbzqq\" (UID: 
\"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.853342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfzxz\" (UniqueName: \"kubernetes.io/projected/aa27bce9-febf-497d-ad48-21b087064f34-kube-api-access-wfzxz\") pod \"glance6ca1-account-delete-t8t45\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.890229 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-rsw7c"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.896791 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicanc616-account-delete-ls8lk"] Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.898080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.914431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j58pv\" (UniqueName: \"kubernetes.io/projected/047e426c-4178-43ce-8a09-ff5b4a6a13f1-kube-api-access-j58pv\") pod \"cinder2a5d-account-delete-tbzqq\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.939683 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanc616-account-delete-ls8lk"] Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.966730 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:34 crc kubenswrapper[4764]: E1204 00:04:34.967026 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data 
podName:bda43f61-31ae-4c4c-967e-f0e8d13f5ae9 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:35.967007971 +0000 UTC m=+1411.728332382 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data") pod "rabbitmq-cell1-server-0" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9") : configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.967889 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:34 crc kubenswrapper[4764]: I1204 00:04:34.992931 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="ovsdbserver-nb" containerID="cri-o://848bf827a01a5b8980ad8035d235310421f22cc7cf5bb62a1d3f4398c2863351" gracePeriod=300 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.058854 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement1158-account-delete-dqsj8"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.060251 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.078858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts\") pod \"placement1158-account-delete-dqsj8\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.078911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b8c157-de2d-4811-a625-1a77c3c7b37b-operator-scripts\") pod \"barbicanc616-account-delete-ls8lk\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.078967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dsls\" (UniqueName: \"kubernetes.io/projected/bd7a5353-be52-43e9-9490-530240b943fe-kube-api-access-9dsls\") pod \"placement1158-account-delete-dqsj8\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.079024 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg25z\" (UniqueName: \"kubernetes.io/projected/05b8c157-de2d-4811-a625-1a77c3c7b37b-kube-api-access-dg25z\") pod \"barbicanc616-account-delete-ls8lk\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.095187 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.103048 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement1158-account-delete-dqsj8"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.120090 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.187460 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dsls\" (UniqueName: \"kubernetes.io/projected/bd7a5353-be52-43e9-9490-530240b943fe-kube-api-access-9dsls\") pod \"placement1158-account-delete-dqsj8\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.187546 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg25z\" (UniqueName: \"kubernetes.io/projected/05b8c157-de2d-4811-a625-1a77c3c7b37b-kube-api-access-dg25z\") pod \"barbicanc616-account-delete-ls8lk\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.187612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts\") pod \"placement1158-account-delete-dqsj8\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.187648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b8c157-de2d-4811-a625-1a77c3c7b37b-operator-scripts\") pod \"barbicanc616-account-delete-ls8lk\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " 
pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.188338 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b8c157-de2d-4811-a625-1a77c3c7b37b-operator-scripts\") pod \"barbicanc616-account-delete-ls8lk\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.189090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts\") pod \"placement1158-account-delete-dqsj8\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.217312 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dlttd"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.247401 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dlttd"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.255826 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dsls\" (UniqueName: \"kubernetes.io/projected/bd7a5353-be52-43e9-9490-530240b943fe-kube-api-access-9dsls\") pod \"placement1158-account-delete-dqsj8\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.263896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg25z\" (UniqueName: \"kubernetes.io/projected/05b8c157-de2d-4811-a625-1a77c3c7b37b-kube-api-access-dg25z\") pod \"barbicanc616-account-delete-ls8lk\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc 
kubenswrapper[4764]: I1204 00:04:35.291486 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-js842"] Dec 04 00:04:35 crc kubenswrapper[4764]: E1204 00:04:35.293601 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 00:04:35 crc kubenswrapper[4764]: E1204 00:04:35.293641 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data podName:76708e9b-1db4-42ca-94d2-7ff96d08d855 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:35.793630075 +0000 UTC m=+1411.554954486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data") pod "rabbitmq-server-0" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855") : configmap "rabbitmq-config-data" not found Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.321002 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-js842"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.352767 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-znhdg"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.353038 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="dnsmasq-dns" containerID="cri-o://8af4acd3740dccbf24a933df87c06cee9c0c919aae4bd8f32b080872dd5b0c80" gracePeriod=10 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.370891 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron4ef6-account-delete-l854j"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.380987 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.385016 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.407292 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.417003 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron4ef6-account-delete-l854j"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.445624 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l2vv9"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.476385 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-xnsqq"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.516616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t76xs\" (UniqueName: \"kubernetes.io/projected/62ce6a22-4796-4b94-9c53-d3088cff26f1-kube-api-access-t76xs\") pod \"neutron4ef6-account-delete-l854j\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.624011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ce6a22-4796-4b94-9c53-d3088cff26f1-operator-scripts\") pod \"neutron4ef6-account-delete-l854j\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.654165 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7hd69"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 
00:04:35.711173 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7hd69"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.727354 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t76xs\" (UniqueName: \"kubernetes.io/projected/62ce6a22-4796-4b94-9c53-d3088cff26f1-kube-api-access-t76xs\") pod \"neutron4ef6-account-delete-l854j\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.727690 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ce6a22-4796-4b94-9c53-d3088cff26f1-operator-scripts\") pod \"neutron4ef6-account-delete-l854j\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.728445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ce6a22-4796-4b94-9c53-d3088cff26f1-operator-scripts\") pod \"neutron4ef6-account-delete-l854j\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.809364 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t76xs\" (UniqueName: \"kubernetes.io/projected/62ce6a22-4796-4b94-9c53-d3088cff26f1-kube-api-access-t76xs\") pod \"neutron4ef6-account-delete-l854j\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.823435 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8zvg9"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.823662 4764 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ovn-controller-metrics-8zvg9" podUID="c98b9272-87ec-43a2-97a7-7f08cdafbf2c" containerName="openstack-network-exporter" containerID="cri-o://6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a" gracePeriod=30 Dec 04 00:04:35 crc kubenswrapper[4764]: E1204 00:04:35.831908 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 00:04:35 crc kubenswrapper[4764]: E1204 00:04:35.832066 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data podName:76708e9b-1db4-42ca-94d2-7ff96d08d855 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:36.832029574 +0000 UTC m=+1412.593353985 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data") pod "rabbitmq-server-0" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855") : configmap "rabbitmq-config-data" not found Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.840262 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-8gb4t"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.856361 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-8gb4t"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.863924 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fcec7a86-cb5c-49e9-af77-30958d09c359/ovsdbserver-nb/0.log" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.863971 4764 generic.go:334] "Generic (PLEG): container finished" podID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerID="92cc9452c05ea4b28722f96dbdaea7af0d12e4f63960239309ced2df5f5e288d" exitCode=2 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.864002 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerID="848bf827a01a5b8980ad8035d235310421f22cc7cf5bb62a1d3f4398c2863351" exitCode=143 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.864044 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fcec7a86-cb5c-49e9-af77-30958d09c359","Type":"ContainerDied","Data":"92cc9452c05ea4b28722f96dbdaea7af0d12e4f63960239309ced2df5f5e288d"} Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.864084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fcec7a86-cb5c-49e9-af77-30958d09c359","Type":"ContainerDied","Data":"848bf827a01a5b8980ad8035d235310421f22cc7cf5bb62a1d3f4398c2863351"} Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.865761 4764 generic.go:334] "Generic (PLEG): container finished" podID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerID="60ba5da4792a87b1525f0ffb79e190a4fc0bc794c1d89f435442a97dc33fe1b1" exitCode=2 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.865799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f105a7d8-bb79-4578-98fd-aca60d5ffa10","Type":"ContainerDied","Data":"60ba5da4792a87b1525f0ffb79e190a4fc0bc794c1d89f435442a97dc33fe1b1"} Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.867791 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.868067 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-log" containerID="cri-o://2398fc991b15ae90f82ce48cda9c9af81acdb7bfc25b1505a825f6d849c9810b" gracePeriod=30 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.868500 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" 
podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-httpd" containerID="cri-o://94f6b973870b4eea1c81fd05faef27612f6926a743b37cea298cec8baf77cde6" gracePeriod=30 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.884063 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerID="8af4acd3740dccbf24a933df87c06cee9c0c919aae4bd8f32b080872dd5b0c80" exitCode=0 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.884096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" event={"ID":"e6152d07-38d3-42e7-953f-d9747b1f8996","Type":"ContainerDied","Data":"8af4acd3740dccbf24a933df87c06cee9c0c919aae4bd8f32b080872dd5b0c80"} Dec 04 00:04:35 crc kubenswrapper[4764]: E1204 00:04:35.893289 4764 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-l2vv9" message=< Dec 04 00:04:35 crc kubenswrapper[4764]: Exiting ovn-controller (1) [ OK ] Dec 04 00:04:35 crc kubenswrapper[4764]: > Dec 04 00:04:35 crc kubenswrapper[4764]: E1204 00:04:35.893324 4764 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-l2vv9" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" containerID="cri-o://f95f4c5c24d79cd3be5e889124f00a07909163b761e1d21f5535b51adffcdebb" Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.893358 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-l2vv9" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" containerID="cri-o://f95f4c5c24d79cd3be5e889124f00a07909163b761e1d21f5535b51adffcdebb" gracePeriod=30 Dec 04 00:04:35 crc 
kubenswrapper[4764]: I1204 00:04:35.932308 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.933679 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-log" containerID="cri-o://4d9413685b93b99db1f424005b4c8e8740303642c0922ed2d65d664c8191da2b" gracePeriod=30 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.935429 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-httpd" containerID="cri-o://9063bab53ed33482cd89acc542fa22e753f09e7ceb951b53766f7dc55ea3fcda" gracePeriod=30 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.972116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.972363 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="cinder-scheduler" containerID="cri-o://efeadb11f0ef1de95143824a054c252e1663f1c06d14e2f5070aa509d85dd5bf" gracePeriod=30 Dec 04 00:04:35 crc kubenswrapper[4764]: I1204 00:04:35.972809 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="probe" containerID="cri-o://9a0f09120101a0329c1149dc889b69fb1f61190068470f9d33bdfee1ebb1a78d" gracePeriod=30 Dec 04 00:04:36 crc kubenswrapper[4764]: E1204 00:04:36.037657 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:36 crc kubenswrapper[4764]: E1204 00:04:36.037724 4764 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data podName:bda43f61-31ae-4c4c-967e-f0e8d13f5ae9 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:38.037700659 +0000 UTC m=+1413.799025070 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data") pod "rabbitmq-cell1-server-0" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9") : configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.068538 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.068786 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api-log" containerID="cri-o://4dec83b8647040009bb8b20db48c59cfdae71ee1a3fa1d5ef147201319666a80" gracePeriod=30 Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.069132 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api" containerID="cri-o://6ef4634d4e9a70890a62dc5bc5ec2d0dea18b5551be672ee6677a592a96cead8" gracePeriod=30 Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.085386 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapifdfe-account-delete-lncz4"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.086704 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.112231 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.136157 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapifdfe-account-delete-lncz4"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.198567 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bplfc"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.247954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd61cda-9474-470d-aee3-9806975eccaf-operator-scripts\") pod \"novaapifdfe-account-delete-lncz4\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.248030 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq64z\" (UniqueName: \"kubernetes.io/projected/0fd61cda-9474-470d-aee3-9806975eccaf-kube-api-access-hq64z\") pod \"novaapifdfe-account-delete-lncz4\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: E1204 00:04:36.270612 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 00:04:36 crc kubenswrapper[4764]: E1204 00:04:36.298658 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.298935 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bplfc"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.307760 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fcec7a86-cb5c-49e9-af77-30958d09c359/ovsdbserver-nb/0.log" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.307826 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 00:04:36 crc kubenswrapper[4764]: E1204 00:04:36.309821 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 00:04:36 crc kubenswrapper[4764]: E1204 00:04:36.309865 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="ovn-northd" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.349974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd61cda-9474-470d-aee3-9806975eccaf-operator-scripts\") pod \"novaapifdfe-account-delete-lncz4\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.350053 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq64z\" (UniqueName: 
\"kubernetes.io/projected/0fd61cda-9474-470d-aee3-9806975eccaf-kube-api-access-hq64z\") pod \"novaapifdfe-account-delete-lncz4\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.354874 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd61cda-9474-470d-aee3-9806975eccaf-operator-scripts\") pod \"novaapifdfe-account-delete-lncz4\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.373078 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4mnsn"] Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.383980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq64z\" (UniqueName: \"kubernetes.io/projected/0fd61cda-9474-470d-aee3-9806975eccaf-kube-api-access-hq64z\") pod \"novaapifdfe-account-delete-lncz4\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:36 crc kubenswrapper[4764]: I1204 00:04:36.388358 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.388862 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4mnsn"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.397165 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell04790-account-delete-g9286"] Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.397527 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="dnsmasq-dns" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.397540 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="dnsmasq-dns" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.397551 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="openstack-network-exporter" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.397557 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="openstack-network-exporter" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.397585 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="init" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.397591 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="init" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.397600 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="ovsdbserver-nb" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.397606 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="ovsdbserver-nb" Dec 04 00:04:37 crc 
kubenswrapper[4764]: I1204 00:04:36.398643 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="openstack-network-exporter" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.398674 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" containerName="ovsdbserver-nb" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.398689 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="dnsmasq-dns" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.399279 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.414614 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell04790-account-delete-g9286"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.427979 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.428504 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-server" containerID="cri-o://229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.428939 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="swift-recon-cron" containerID="cri-o://65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.428996 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="rsync" containerID="cri-o://a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429039 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-expirer" containerID="cri-o://79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429072 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-updater" containerID="cri-o://bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429102 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-auditor" containerID="cri-o://b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429132 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-replicator" containerID="cri-o://3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429164 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-server" containerID="cri-o://db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429191 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-server" containerID="cri-o://8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429212 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-auditor" containerID="cri-o://bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429233 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-auditor" containerID="cri-o://5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429251 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-replicator" containerID="cri-o://ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429259 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-replicator" containerID="cri-o://9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429203 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-reaper" 
containerID="cri-o://49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.429293 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-updater" containerID="cri-o://6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452532 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4srwm\" (UniqueName: \"kubernetes.io/projected/fcec7a86-cb5c-49e9-af77-30958d09c359-kube-api-access-4srwm\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452572 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdbserver-nb-tls-certs\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452595 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-metrics-certs-tls-certs\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-config\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452683 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdb-rundir\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452815 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-combined-ca-bundle\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.452889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-scripts\") pod \"fcec7a86-cb5c-49e9-af77-30958d09c359\" (UID: \"fcec7a86-cb5c-49e9-af77-30958d09c359\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.458620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-scripts" (OuterVolumeSpecName: "scripts") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.459593 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-config" (OuterVolumeSpecName: "config") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.462906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.462981 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d45ff9d86-725zf"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.470872 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d45ff9d86-725zf" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-log" containerID="cri-o://f79070c4af81fb3ba806ca5d2c61d64116e1765e388c3c78534b5f9ef1cd7663" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.470929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.471019 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6d45ff9d86-725zf" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-api" containerID="cri-o://c0f14891d1b59f0d4bb85f831e2a4b7f44911359e51183f14fe60719afd8d989" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.489372 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.490128 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="openstack-network-exporter" containerID="cri-o://daaf88f8c3b80eeebebe4394604c8c9052727fcd26fe8a3c91f09babde9cc83e" gracePeriod=300 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.497777 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcec7a86-cb5c-49e9-af77-30958d09c359-kube-api-access-4srwm" (OuterVolumeSpecName: "kube-api-access-4srwm") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "kube-api-access-4srwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.502792 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c96d99869-mwjrh"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.503021 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c96d99869-mwjrh" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-api" containerID="cri-o://f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.503467 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c96d99869-mwjrh" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-httpd" containerID="cri-o://a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.529642 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.554592 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-swift-storage-0\") pod \"e6152d07-38d3-42e7-953f-d9747b1f8996\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.554707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-config\") pod \"e6152d07-38d3-42e7-953f-d9747b1f8996\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.554799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2qtl\" (UniqueName: 
\"kubernetes.io/projected/e6152d07-38d3-42e7-953f-d9747b1f8996-kube-api-access-n2qtl\") pod \"e6152d07-38d3-42e7-953f-d9747b1f8996\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.554832 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-nb\") pod \"e6152d07-38d3-42e7-953f-d9747b1f8996\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.554877 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-svc\") pod \"e6152d07-38d3-42e7-953f-d9747b1f8996\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.554911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-sb\") pod \"e6152d07-38d3-42e7-953f-d9747b1f8996\" (UID: \"e6152d07-38d3-42e7-953f-d9747b1f8996\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-operator-scripts\") pod \"novacell04790-account-delete-g9286\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnplj\" (UniqueName: \"kubernetes.io/projected/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-kube-api-access-rnplj\") pod \"novacell04790-account-delete-g9286\" (UID: 
\"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555346 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4srwm\" (UniqueName: \"kubernetes.io/projected/fcec7a86-cb5c-49e9-af77-30958d09c359-kube-api-access-4srwm\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555359 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555367 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555387 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.555398 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcec7a86-cb5c-49e9-af77-30958d09c359-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.566931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6152d07-38d3-42e7-953f-d9747b1f8996-kube-api-access-n2qtl" (OuterVolumeSpecName: "kube-api-access-n2qtl") pod "e6152d07-38d3-42e7-953f-d9747b1f8996" (UID: "e6152d07-38d3-42e7-953f-d9747b1f8996"). InnerVolumeSpecName "kube-api-access-n2qtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.581200 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.598592 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="ovsdbserver-sb" containerID="cri-o://5ff0a10729c25996e23b4567c55f3aef2507de83dbe93682e1eebbaeef977a9f" gracePeriod=300 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.615594 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee4e1b8-616d-469c-988d-f371d65275d9" path="/var/lib/kubelet/pods/1ee4e1b8-616d-469c-988d-f371d65275d9/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.616585 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4312db12-846e-4bc4-8f2f-7121ac50776d" path="/var/lib/kubelet/pods/4312db12-846e-4bc4-8f2f-7121ac50776d/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.617599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f46018a-c0f3-4feb-9a18-d3a8e80d3ded" path="/var/lib/kubelet/pods/5f46018a-c0f3-4feb-9a18-d3a8e80d3ded/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.618961 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c4fa015-bd9b-44c9-a09b-41630154ec52" path="/var/lib/kubelet/pods/6c4fa015-bd9b-44c9-a09b-41630154ec52/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.619630 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959bf4d2-5d71-4a02-a0a0-8c417c2a7d31" path="/var/lib/kubelet/pods/959bf4d2-5d71-4a02-a0a0-8c417c2a7d31/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.620692 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d131b557-f02e-4925-a9eb-52202bce1b00" path="/var/lib/kubelet/pods/d131b557-f02e-4925-a9eb-52202bce1b00/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.622139 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d15434b3-f4f4-4e53-8ddd-db0df89aca8a" path="/var/lib/kubelet/pods/d15434b3-f4f4-4e53-8ddd-db0df89aca8a/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.624880 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a" path="/var/lib/kubelet/pods/e19a66e3-3521-4e8d-b4ff-6a9ef5003a8a/volumes" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.630382 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.630562 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-log" containerID="cri-o://506954a3df1b596b6cc009eafe6c0378475f7b74778bfc77c7cd68e0dfd9aa9d" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.630895 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-metadata" containerID="cri-o://6f3ad0a68a4b98fc593b43b878fbce89575024644243ce672d381c81a0dabf6a" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.632356 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5869cb876-lfmmz"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.632656 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener-log" containerID="cri-o://24fed489eecbcfbdd5a929e01e1c310bcf42cb3baa89d27eff4bf18f0bf16997" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.632727 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener" containerID="cri-o://71b9323307a4b2c17ee25a0cbf4f507e0f54dcd057e06354d3d97cbd5b67d385" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.657042 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-operator-scripts\") pod \"novacell04790-account-delete-g9286\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.657079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnplj\" (UniqueName: \"kubernetes.io/projected/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-kube-api-access-rnplj\") pod \"novacell04790-account-delete-g9286\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.657182 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2qtl\" (UniqueName: \"kubernetes.io/projected/e6152d07-38d3-42e7-953f-d9747b1f8996-kube-api-access-n2qtl\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.657209 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54ddd476ff-9v8dj"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.657448 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-54ddd476ff-9v8dj" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker-log" containerID="cri-o://da01bc5b68c5d213728724e481b6998adc831fb8b0fab47de60acb606240ad85" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.657532 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/barbican-worker-54ddd476ff-9v8dj" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker" containerID="cri-o://05700e796c28abd13f4c5635747b2b007e49376c6c97684f76cd88c2347348c6" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.666343 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-operator-scripts\") pod \"novacell04790-account-delete-g9286\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.688447 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnplj\" (UniqueName: \"kubernetes.io/projected/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-kube-api-access-rnplj\") pod \"novacell04790-account-delete-g9286\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.693913 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.694102 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-log" containerID="cri-o://32bfd6d3548d7ed75468d396b935e173001398b11d776ac46fbc7e65ee2ad928" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.694280 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-api" containerID="cri-o://bfa789db8eedd550d660743c753b3bea2fab2bce89eb7947314062414fa5026a" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.703098 4764 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-api-789dfd9c8d-k4z96"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.703300 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-789dfd9c8d-k4z96" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api-log" containerID="cri-o://5523cc0c69a6274103cea6cdb99c2b0cb069c2b4434f1a09627b39395825d92d" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.703706 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-789dfd9c8d-k4z96" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api" containerID="cri-o://0f1e5405b57025512e61585a9e9a3c74dacc900d7181ee5cacc158e3f86552fc" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.714516 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.724477 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8zvg9_c98b9272-87ec-43a2-97a7-7f08cdafbf2c/openstack-network-exporter/0.log" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.724924 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8zvg9" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.754191 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.762480 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" containerID="cri-o://2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" gracePeriod=29 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.773612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6152d07-38d3-42e7-953f-d9747b1f8996" (UID: "e6152d07-38d3-42e7-953f-d9747b1f8996"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.778674 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.791146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6152d07-38d3-42e7-953f-d9747b1f8996" (UID: "e6152d07-38d3-42e7-953f-d9747b1f8996"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.801093 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.820821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-config" (OuterVolumeSpecName: "config") pod "e6152d07-38d3-42e7-953f-d9747b1f8996" (UID: "e6152d07-38d3-42e7-953f-d9747b1f8996"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.824380 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.824462 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.824508 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.821450 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2e9e-account-create-update-6nxlv"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.833942 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="rabbitmq" containerID="cri-o://0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf" 
gracePeriod=604800 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.868887 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2e9e-account-create-update-6nxlv"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.871741 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6152d07-38d3-42e7-953f-d9747b1f8996" (UID: "e6152d07-38d3-42e7-953f-d9747b1f8996"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.886629 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-m4ntf"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.893524 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-m4ntf"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.896877 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrx6k"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.912480 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hrx6k"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.913472 4764 generic.go:334] "Generic (PLEG): container finished" podID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerID="506954a3df1b596b6cc009eafe6c0378475f7b74778bfc77c7cd68e0dfd9aa9d" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.913508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9218a48-75e0-47ae-a2ac-2d2fa4d08971","Type":"ContainerDied","Data":"506954a3df1b596b6cc009eafe6c0378475f7b74778bfc77c7cd68e0dfd9aa9d"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.914693 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerID="9a0f09120101a0329c1149dc889b69fb1f61190068470f9d33bdfee1ebb1a78d" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.914736 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6","Type":"ContainerDied","Data":"9a0f09120101a0329c1149dc889b69fb1f61190068470f9d33bdfee1ebb1a78d"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.915629 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8zvg9_c98b9272-87ec-43a2-97a7-7f08cdafbf2c/openstack-network-exporter/0.log" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.915650 4764 generic.go:334] "Generic (PLEG): container finished" podID="c98b9272-87ec-43a2-97a7-7f08cdafbf2c" containerID="6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a" exitCode=2 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.915679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8zvg9" event={"ID":"c98b9272-87ec-43a2-97a7-7f08cdafbf2c","Type":"ContainerDied","Data":"6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.915693 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8zvg9" event={"ID":"c98b9272-87ec-43a2-97a7-7f08cdafbf2c","Type":"ContainerDied","Data":"5cb902c52acaaba29ea9395d05b55ee1063eefcf74a2f271bef160e7314767c6"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.915708 4764 scope.go:117] "RemoveContainer" containerID="6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.915848 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8zvg9" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.918957 4764 generic.go:334] "Generic (PLEG): container finished" podID="8499c909-53fe-4742-aa11-29e214451689" containerID="a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.918996 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c96d99869-mwjrh" event={"ID":"8499c909-53fe-4742-aa11-29e214451689","Type":"ContainerDied","Data":"a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.919837 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.919961 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.921149 4764 generic.go:334] "Generic (PLEG): container finished" podID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerID="24fed489eecbcfbdd5a929e01e1c310bcf42cb3baa89d27eff4bf18f0bf16997" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.921183 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" event={"ID":"ee8deb66-8364-4d9c-bd17-e4ad937a35e2","Type":"ContainerDied","Data":"24fed489eecbcfbdd5a929e01e1c310bcf42cb3baa89d27eff4bf18f0bf16997"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.925681 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovs-rundir\") 
pod \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.925812 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-config\") pod \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.925875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "c98b9272-87ec-43a2-97a7-7f08cdafbf2c" (UID: "c98b9272-87ec-43a2-97a7-7f08cdafbf2c"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.925883 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-combined-ca-bundle\") pod \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.925950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovn-rundir\") pod \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.925991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-metrics-certs-tls-certs\") pod \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926013 
4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c98b9272-87ec-43a2-97a7-7f08cdafbf2c" (UID: "c98b9272-87ec-43a2-97a7-7f08cdafbf2c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926037 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7mgj\" (UniqueName: \"kubernetes.io/projected/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-kube-api-access-p7mgj\") pod \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\" (UID: \"c98b9272-87ec-43a2-97a7-7f08cdafbf2c\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926415 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926427 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926436 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926444 4764 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.926516 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 00:04:37 crc kubenswrapper[4764]: 
E1204 00:04:36.926563 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data podName:76708e9b-1db4-42ca-94d2-7ff96d08d855 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:38.92654677 +0000 UTC m=+1414.687871181 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data") pod "rabbitmq-server-0" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855") : configmap "rabbitmq-config-data" not found Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.926980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-config" (OuterVolumeSpecName: "config") pod "c98b9272-87ec-43a2-97a7-7f08cdafbf2c" (UID: "c98b9272-87ec-43a2-97a7-7f08cdafbf2c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.927191 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerID="2398fc991b15ae90f82ce48cda9c9af81acdb7bfc25b1505a825f6d849c9810b" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.927652 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef3ecde-294a-410a-ba90-d08a00674b9f","Type":"ContainerDied","Data":"2398fc991b15ae90f82ce48cda9c9af81acdb7bfc25b1505a825f6d849c9810b"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.932160 4764 generic.go:334] "Generic (PLEG): container finished" podID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerID="32bfd6d3548d7ed75468d396b935e173001398b11d776ac46fbc7e65ee2ad928" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.932208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ae2d9b02-4247-444e-ba56-05d65493dd3e","Type":"ContainerDied","Data":"32bfd6d3548d7ed75468d396b935e173001398b11d776ac46fbc7e65ee2ad928"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.933090 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.933295 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="155f4570-7769-42ab-8bc0-168dba070531" containerName="nova-cell1-conductor-conductor" containerID="cri-o://bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.936661 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249c3a6b-9345-49ed-9b2d-a0991fb02dc0/ovsdbserver-sb/0.log" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.936692 4764 generic.go:334] "Generic (PLEG): container finished" podID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerID="daaf88f8c3b80eeebebe4394604c8c9052727fcd26fe8a3c91f09babde9cc83e" exitCode=2 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.936703 4764 generic.go:334] "Generic (PLEG): container finished" podID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerID="5ff0a10729c25996e23b4567c55f3aef2507de83dbe93682e1eebbaeef977a9f" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.936757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249c3a6b-9345-49ed-9b2d-a0991fb02dc0","Type":"ContainerDied","Data":"daaf88f8c3b80eeebebe4394604c8c9052727fcd26fe8a3c91f09babde9cc83e"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.936776 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"249c3a6b-9345-49ed-9b2d-a0991fb02dc0","Type":"ContainerDied","Data":"5ff0a10729c25996e23b4567c55f3aef2507de83dbe93682e1eebbaeef977a9f"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.939194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.940099 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-kube-api-access-p7mgj" (OuterVolumeSpecName: "kube-api-access-p7mgj") pod "c98b9272-87ec-43a2-97a7-7f08cdafbf2c" (UID: "c98b9272-87ec-43a2-97a7-7f08cdafbf2c"). InnerVolumeSpecName "kube-api-access-p7mgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.941780 4764 generic.go:334] "Generic (PLEG): container finished" podID="3a7dd687-d272-4102-bc70-199b44353a21" containerID="da01bc5b68c5d213728724e481b6998adc831fb8b0fab47de60acb606240ad85" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.941845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54ddd476ff-9v8dj" event={"ID":"3a7dd687-d272-4102-bc70-199b44353a21","Type":"ContainerDied","Data":"da01bc5b68c5d213728724e481b6998adc831fb8b0fab47de60acb606240ad85"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.943533 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.943586 4764 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 04 00:04:37 crc kubenswrapper[4764]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 04 00:04:37 crc kubenswrapper[4764]: + source /usr/local/bin/container-scripts/functions Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNBridge=br-int Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNRemote=tcp:localhost:6642 Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNEncapType=geneve Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNAvailabilityZones= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ EnableChassisAsGateway=true Dec 04 00:04:37 crc kubenswrapper[4764]: ++ PhysicalNetworks= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNHostName= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 04 00:04:37 crc kubenswrapper[4764]: ++ ovs_dir=/var/lib/openvswitch Dec 04 00:04:37 crc kubenswrapper[4764]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 04 00:04:37 crc kubenswrapper[4764]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 04 00:04:37 crc kubenswrapper[4764]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + sleep 0.5 Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + sleep 0.5 Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + cleanup_ovsdb_server_semaphore Dec 04 00:04:37 crc kubenswrapper[4764]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 00:04:37 crc kubenswrapper[4764]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 04 00:04:37 crc kubenswrapper[4764]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-xnsqq" message=< Dec 04 00:04:37 crc kubenswrapper[4764]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 04 00:04:37 crc kubenswrapper[4764]: + source /usr/local/bin/container-scripts/functions Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNBridge=br-int Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNRemote=tcp:localhost:6642 Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNEncapType=geneve Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNAvailabilityZones= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ EnableChassisAsGateway=true Dec 04 00:04:37 crc kubenswrapper[4764]: ++ PhysicalNetworks= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNHostName= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 04 00:04:37 crc kubenswrapper[4764]: ++ ovs_dir=/var/lib/openvswitch Dec 04 00:04:37 crc kubenswrapper[4764]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 04 00:04:37 crc kubenswrapper[4764]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 04 00:04:37 crc kubenswrapper[4764]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + sleep 0.5 Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + sleep 0.5 Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + cleanup_ovsdb_server_semaphore Dec 04 00:04:37 crc kubenswrapper[4764]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 00:04:37 crc kubenswrapper[4764]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 04 00:04:37 crc kubenswrapper[4764]: > Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:36.943856 4764 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 04 00:04:37 crc kubenswrapper[4764]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 04 00:04:37 crc kubenswrapper[4764]: + source /usr/local/bin/container-scripts/functions Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNBridge=br-int Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNRemote=tcp:localhost:6642 Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNEncapType=geneve Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNAvailabilityZones= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ EnableChassisAsGateway=true Dec 04 00:04:37 crc kubenswrapper[4764]: ++ PhysicalNetworks= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ OVNHostName= Dec 04 00:04:37 crc kubenswrapper[4764]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 04 00:04:37 crc kubenswrapper[4764]: ++ ovs_dir=/var/lib/openvswitch Dec 04 00:04:37 crc kubenswrapper[4764]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 04 00:04:37 crc kubenswrapper[4764]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 04 00:04:37 crc kubenswrapper[4764]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + sleep 0.5 Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + sleep 0.5 Dec 04 00:04:37 crc kubenswrapper[4764]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 04 00:04:37 crc kubenswrapper[4764]: + cleanup_ovsdb_server_semaphore Dec 04 00:04:37 crc kubenswrapper[4764]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 04 00:04:37 crc kubenswrapper[4764]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 04 00:04:37 crc kubenswrapper[4764]: > pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" containerID="cri-o://72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.943902 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" containerID="cri-o://72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" gracePeriod=29 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.943599 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="rabbitmq" containerID="cri-o://699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972" gracePeriod=604800 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.954033 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h7p67"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.957483 4764 generic.go:334] "Generic (PLEG): container finished" podID="ec3e74e4-e0bc-45a3-a568-c70087b73572" 
containerID="5523cc0c69a6274103cea6cdb99c2b0cb069c2b4434f1a09627b39395825d92d" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.957582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-789dfd9c8d-k4z96" event={"ID":"ec3e74e4-e0bc-45a3-a568-c70087b73572","Type":"ContainerDied","Data":"5523cc0c69a6274103cea6cdb99c2b0cb069c2b4434f1a09627b39395825d92d"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.965398 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h7p67"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.973647 4764 generic.go:334] "Generic (PLEG): container finished" podID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" containerID="563686178a29a5fa1144486490dc7517a511b76ab0bc8be6b06e40785fc8ba8a" exitCode=137 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.987066 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.987276 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="74af5cde-29d3-4ff7-803b-fb335fc8209c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://49b1d45450edbc5e515da4d8f049c2433b6679ef6419dc9e04e61f99fccf319b" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.991022 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_fcec7a86-cb5c-49e9-af77-30958d09c359/ovsdbserver-nb/0.log" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.991855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"fcec7a86-cb5c-49e9-af77-30958d09c359","Type":"ContainerDied","Data":"0bb1361cc63c0a90e857adf49e76d7b94911883caf7741f4d64321925645a31c"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:36.992034 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.001934 4764 generic.go:334] "Generic (PLEG): container finished" podID="3cad4f7f-7546-406c-822b-b6f77365d830" containerID="f79070c4af81fb3ba806ca5d2c61d64116e1765e388c3c78534b5f9ef1cd7663" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.002011 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d45ff9d86-725zf" event={"ID":"3cad4f7f-7546-406c-822b-b6f77365d830","Type":"ContainerDied","Data":"f79070c4af81fb3ba806ca5d2c61d64116e1765e388c3c78534b5f9ef1cd7663"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.011557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e6152d07-38d3-42e7-953f-d9747b1f8996" (UID: "e6152d07-38d3-42e7-953f-d9747b1f8996"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.011898 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.012104 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0baff485-3721-45b5-9177-96c30ce03251" containerName="nova-scheduler-scheduler" containerID="cri-o://c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.015919 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" event={"ID":"e6152d07-38d3-42e7-953f-d9747b1f8996","Type":"ContainerDied","Data":"0f0dd0efe85778b87bfd31a1306fdfebd9558b21164f1bb2bd1dc32836c34c69"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.016034 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.024645 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fcec7a86-cb5c-49e9-af77-30958d09c359" (UID: "fcec7a86-cb5c-49e9-af77-30958d09c359"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.038141 4764 generic.go:334] "Generic (PLEG): container finished" podID="7323df53-27cc-46a0-ad81-1e916db379af" containerID="4dec83b8647040009bb8b20db48c59cfdae71ee1a3fa1d5ef147201319666a80" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.038229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7323df53-27cc-46a0-ad81-1e916db379af","Type":"ContainerDied","Data":"4dec83b8647040009bb8b20db48c59cfdae71ee1a3fa1d5ef147201319666a80"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.047608 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.047647 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.047663 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.047675 4764 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6152d07-38d3-42e7-953f-d9747b1f8996-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.047686 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcec7a86-cb5c-49e9-af77-30958d09c359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc 
kubenswrapper[4764]: I1204 00:04:37.047696 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7mgj\" (UniqueName: \"kubernetes.io/projected/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-kube-api-access-p7mgj\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.059263 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder2a5d-account-delete-tbzqq"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066432 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066455 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066475 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066484 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066491 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066497 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6" exitCode=0 Dec 04 00:04:37 crc 
kubenswrapper[4764]: I1204 00:04:37.066502 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066508 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066514 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066520 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066527 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066533 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066614 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066649 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 
00:04:37.066656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066673 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.066682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.076126 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerID="f95f4c5c24d79cd3be5e889124f00a07909163b761e1d21f5535b51adffcdebb" exitCode=0 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.076190 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l2vv9" event={"ID":"ab264d6c-eecf-496f-b505-39b128dd8e44","Type":"ContainerDied","Data":"f95f4c5c24d79cd3be5e889124f00a07909163b761e1d21f5535b51adffcdebb"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.076294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l2vv9" 
event={"ID":"ab264d6c-eecf-496f-b505-39b128dd8e44","Type":"ContainerDied","Data":"e34ea702b8e802e242113cae6f79d16871096f4dc8404bf566245c3eb62369dc"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.076309 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e34ea702b8e802e242113cae6f79d16871096f4dc8404bf566245c3eb62369dc" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.078090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c98b9272-87ec-43a2-97a7-7f08cdafbf2c" (UID: "c98b9272-87ec-43a2-97a7-7f08cdafbf2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.086565 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerID="4d9413685b93b99db1f424005b4c8e8740303642c0922ed2d65d664c8191da2b" exitCode=143 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.086605 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6de1323-46ca-460b-8a8f-620125ce1d7f","Type":"ContainerDied","Data":"4d9413685b93b99db1f424005b4c8e8740303642c0922ed2d65d664c8191da2b"} Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.149638 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.154851 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-l2vv9" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.158884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c98b9272-87ec-43a2-97a7-7f08cdafbf2c" (UID: "c98b9272-87ec-43a2-97a7-7f08cdafbf2c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.165377 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-znhdg"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.172941 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f9fbbf6f7-znhdg"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.178436 4764 scope.go:117] "RemoveContainer" containerID="6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.188261 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a\": container with ID starting with 6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a not found: ID does not exist" containerID="6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.188302 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a"} err="failed to get container status \"6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a\": rpc error: code = NotFound desc = could not find container \"6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a\": container with ID starting with 
6cd92de4172616a51a3c817fb38e35c3dd801a69abe294f9837a3d79d44e0d1a not found: ID does not exist" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.188329 4764 scope.go:117] "RemoveContainer" containerID="92cc9452c05ea4b28722f96dbdaea7af0d12e4f63960239309ced2df5f5e288d" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.206757 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.252773 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-combined-ca-bundle\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.252883 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab264d6c-eecf-496f-b505-39b128dd8e44-scripts\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.252905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2fbx\" (UniqueName: \"kubernetes.io/projected/ab264d6c-eecf-496f-b505-39b128dd8e44-kube-api-access-f2fbx\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.252924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-ovn-controller-tls-certs\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: 
\"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.252945 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.253029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run-ovn\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.253047 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-log-ovn\") pod \"ab264d6c-eecf-496f-b505-39b128dd8e44\" (UID: \"ab264d6c-eecf-496f-b505-39b128dd8e44\") " Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.253529 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98b9272-87ec-43a2-97a7-7f08cdafbf2c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.253574 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.254149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab264d6c-eecf-496f-b505-39b128dd8e44-scripts" (OuterVolumeSpecName: "scripts") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.254209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run" (OuterVolumeSpecName: "var-run") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.254630 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.265851 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.268369 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.268757 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.272191 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.273889 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.273965 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.273986 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.276023 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6677596fcf-6rh2n"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.276197 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6677596fcf-6rh2n" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-httpd" containerID="cri-o://3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.276473 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6677596fcf-6rh2n" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-server" containerID="cri-o://7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 
00:04:37.287351 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab264d6c-eecf-496f-b505-39b128dd8e44-kube-api-access-f2fbx" (OuterVolumeSpecName: "kube-api-access-f2fbx") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "kube-api-access-f2fbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.297592 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:37 crc kubenswrapper[4764]: E1204 00:04:37.297650 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.301089 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.314694 4764 scope.go:117] "RemoveContainer" containerID="848bf827a01a5b8980ad8035d235310421f22cc7cf5bb62a1d3f4398c2863351" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.361836 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.361866 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab264d6c-eecf-496f-b505-39b128dd8e44-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.361878 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2fbx\" (UniqueName: \"kubernetes.io/projected/ab264d6c-eecf-496f-b505-39b128dd8e44-kube-api-access-f2fbx\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.361888 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.361896 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.361907 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ab264d6c-eecf-496f-b505-39b128dd8e44-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.379809 4764 scope.go:117] "RemoveContainer" containerID="8af4acd3740dccbf24a933df87c06cee9c0c919aae4bd8f32b080872dd5b0c80" 
Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.400376 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8zvg9"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.408556 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-8zvg9"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.414150 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.421666 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.438373 4764 scope.go:117] "RemoveContainer" containerID="1a9344597699f53d84c39d5a1632513c0b73908012de706938aa4ae664672a32" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.450852 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="6160ab00-1691-41f8-9902-80d33e435770" containerName="galera" containerID="cri-o://6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4" gracePeriod=30 Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.460623 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "ab264d6c-eecf-496f-b505-39b128dd8e44" (UID: "ab264d6c-eecf-496f-b505-39b128dd8e44"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:37 crc kubenswrapper[4764]: I1204 00:04:37.464354 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab264d6c-eecf-496f-b505-39b128dd8e44-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.027889 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.037976 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249c3a6b-9345-49ed-9b2d-a0991fb02dc0/ovsdbserver-sb/0.log" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.038040 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.051528 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance6ca1-account-delete-t8t45"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.092898 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.092997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdb-rundir\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093043 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-combined-ca-bundle\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-metrics-certs-tls-certs\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdbserver-sb-tls-certs\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config-secret\") pod \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093185 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-config\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093246 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-combined-ca-bundle\") pod \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " Dec 04 00:04:38 
crc kubenswrapper[4764]: I1204 00:04:38.093315 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-scripts\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config\") pod \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093371 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56kpm\" (UniqueName: \"kubernetes.io/projected/8291acae-68d4-4e14-b0a7-40d026ff1cb2-kube-api-access-56kpm\") pod \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\" (UID: \"8291acae-68d4-4e14-b0a7-40d026ff1cb2\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.093408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7jh7\" (UniqueName: \"kubernetes.io/projected/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-kube-api-access-x7jh7\") pod \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\" (UID: \"249c3a6b-9345-49ed-9b2d-a0991fb02dc0\") " Dec 04 00:04:38 crc kubenswrapper[4764]: E1204 00:04:38.095192 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:38 crc kubenswrapper[4764]: E1204 00:04:38.095248 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data podName:bda43f61-31ae-4c4c-967e-f0e8d13f5ae9 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:42.095234542 +0000 UTC m=+1417.856558953 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data") pod "rabbitmq-cell1-server-0" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9") : configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.100066 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.112615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-kube-api-access-x7jh7" (OuterVolumeSpecName: "kube-api-access-x7jh7") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "kube-api-access-x7jh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.113391 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-scripts" (OuterVolumeSpecName: "scripts") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.123609 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-config" (OuterVolumeSpecName: "config") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.128615 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.138390 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_249c3a6b-9345-49ed-9b2d-a0991fb02dc0/ovsdbserver-sb/0.log" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.138464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"249c3a6b-9345-49ed-9b2d-a0991fb02dc0","Type":"ContainerDied","Data":"115299cd95d9c0ccbefe3eb31bf8c5ba467cfd3f82d8ea997f1fed3d53c0c123"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.138508 4764 scope.go:117] "RemoveContainer" containerID="daaf88f8c3b80eeebebe4394604c8c9052727fcd26fe8a3c91f09babde9cc83e" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.138663 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.139693 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8291acae-68d4-4e14-b0a7-40d026ff1cb2-kube-api-access-56kpm" (OuterVolumeSpecName: "kube-api-access-56kpm") pod "8291acae-68d4-4e14-b0a7-40d026ff1cb2" (UID: "8291acae-68d4-4e14-b0a7-40d026ff1cb2"). InnerVolumeSpecName "kube-api-access-56kpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.181201 4764 generic.go:334] "Generic (PLEG): container finished" podID="74af5cde-29d3-4ff7-803b-fb335fc8209c" containerID="49b1d45450edbc5e515da4d8f049c2433b6679ef6419dc9e04e61f99fccf319b" exitCode=0 Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.181358 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8291acae-68d4-4e14-b0a7-40d026ff1cb2" (UID: "8291acae-68d4-4e14-b0a7-40d026ff1cb2"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.181412 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74af5cde-29d3-4ff7-803b-fb335fc8209c","Type":"ContainerDied","Data":"49b1d45450edbc5e515da4d8f049c2433b6679ef6419dc9e04e61f99fccf319b"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198321 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198353 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198363 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56kpm\" (UniqueName: \"kubernetes.io/projected/8291acae-68d4-4e14-b0a7-40d026ff1cb2-kube-api-access-56kpm\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198371 4764 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7jh7\" (UniqueName: \"kubernetes.io/projected/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-kube-api-access-x7jh7\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198391 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198400 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.198407 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.203414 4764 generic.go:334] "Generic (PLEG): container finished" podID="047e426c-4178-43ce-8a09-ff5b4a6a13f1" containerID="bb2e7ee1217f8d3f46ff3a25bdc283e42671a7b464b00ec6b61dc689fa57b84b" exitCode=0 Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.203486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder2a5d-account-delete-tbzqq" event={"ID":"047e426c-4178-43ce-8a09-ff5b4a6a13f1","Type":"ContainerDied","Data":"bb2e7ee1217f8d3f46ff3a25bdc283e42671a7b464b00ec6b61dc689fa57b84b"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.203518 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder2a5d-account-delete-tbzqq" event={"ID":"047e426c-4178-43ce-8a09-ff5b4a6a13f1","Type":"ContainerStarted","Data":"9bbc95a4750dd933a869b96126183ba09d4a84e99d7126b4f43c1e98966d06fc"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.208652 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.222888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8291acae-68d4-4e14-b0a7-40d026ff1cb2" (UID: "8291acae-68d4-4e14-b0a7-40d026ff1cb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.250174 4764 generic.go:334] "Generic (PLEG): container finished" podID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerID="3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf" exitCode=0 Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.250278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement1158-account-delete-dqsj8"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.250306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6677596fcf-6rh2n" event={"ID":"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7","Type":"ContainerDied","Data":"3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.268054 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.292968 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a" exitCode=0 Dec 04 00:04:38 crc 
kubenswrapper[4764]: I1204 00:04:38.293018 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8" exitCode=0 Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.293430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.293467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.299669 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.299695 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.300354 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.302924 4764 generic.go:334] "Generic (PLEG): container finished" podID="9f850034-7f6e-4811-b98f-89648c559dcd" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" exitCode=0 Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.302994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-xnsqq" event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerDied","Data":"72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.303548 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.306529 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.310587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance6ca1-account-delete-t8t45" event={"ID":"aa27bce9-febf-497d-ad48-21b087064f34","Type":"ContainerStarted","Data":"14d73ef1eb7d96708f245df905b00f90430c0778b19499b78a8419a07a2e0e89"} Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.310643 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l2vv9" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.315480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "249c3a6b-9345-49ed-9b2d-a0991fb02dc0" (UID: "249c3a6b-9345-49ed-9b2d-a0991fb02dc0"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.354254 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicanc616-account-delete-ls8lk"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.358648 4764 scope.go:117] "RemoveContainer" containerID="5ff0a10729c25996e23b4567c55f3aef2507de83dbe93682e1eebbaeef977a9f" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.361422 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8291acae-68d4-4e14-b0a7-40d026ff1cb2" (UID: "8291acae-68d4-4e14-b0a7-40d026ff1cb2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.406224 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.406557 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/249c3a6b-9345-49ed-9b2d-a0991fb02dc0-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.406568 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8291acae-68d4-4e14-b0a7-40d026ff1cb2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.417065 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l2vv9"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.424119 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ovn-controller-l2vv9"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.434453 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.473318 4764 scope.go:117] "RemoveContainer" containerID="563686178a29a5fa1144486490dc7517a511b76ab0bc8be6b06e40785fc8ba8a" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.497180 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.503632 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.508654 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sck9\" (UniqueName: \"kubernetes.io/projected/74af5cde-29d3-4ff7-803b-fb335fc8209c-kube-api-access-9sck9\") pod \"74af5cde-29d3-4ff7-803b-fb335fc8209c\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.508733 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-config-data\") pod \"74af5cde-29d3-4ff7-803b-fb335fc8209c\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.508792 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-vencrypt-tls-certs\") pod \"74af5cde-29d3-4ff7-803b-fb335fc8209c\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.508825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-combined-ca-bundle\") pod \"74af5cde-29d3-4ff7-803b-fb335fc8209c\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.508905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-nova-novncproxy-tls-certs\") pod \"74af5cde-29d3-4ff7-803b-fb335fc8209c\" (UID: \"74af5cde-29d3-4ff7-803b-fb335fc8209c\") " Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.531285 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74af5cde-29d3-4ff7-803b-fb335fc8209c-kube-api-access-9sck9" (OuterVolumeSpecName: "kube-api-access-9sck9") pod "74af5cde-29d3-4ff7-803b-fb335fc8209c" (UID: "74af5cde-29d3-4ff7-803b-fb335fc8209c"). InnerVolumeSpecName "kube-api-access-9sck9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.576047 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74af5cde-29d3-4ff7-803b-fb335fc8209c" (UID: "74af5cde-29d3-4ff7-803b-fb335fc8209c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.584955 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-config-data" (OuterVolumeSpecName: "config-data") pod "74af5cde-29d3-4ff7-803b-fb335fc8209c" (UID: "74af5cde-29d3-4ff7-803b-fb335fc8209c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.594026 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2012c377-0bc0-43e3-a919-6b6f753d9dde" path="/var/lib/kubelet/pods/2012c377-0bc0-43e3-a919-6b6f753d9dde/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.595440 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" path="/var/lib/kubelet/pods/249c3a6b-9345-49ed-9b2d-a0991fb02dc0/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.596228 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8e96e1-9fbf-416e-a8ab-91b0f8f98946" path="/var/lib/kubelet/pods/2e8e96e1-9fbf-416e-a8ab-91b0f8f98946/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.597915 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" path="/var/lib/kubelet/pods/8291acae-68d4-4e14-b0a7-40d026ff1cb2/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.598601 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" path="/var/lib/kubelet/pods/ab264d6c-eecf-496f-b505-39b128dd8e44/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.599636 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98b9272-87ec-43a2-97a7-7f08cdafbf2c" path="/var/lib/kubelet/pods/c98b9272-87ec-43a2-97a7-7f08cdafbf2c/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.608524 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbd1176f-8fbf-442a-98ed-293aff954480" path="/var/lib/kubelet/pods/cbd1176f-8fbf-442a-98ed-293aff954480/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.609348 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" 
path="/var/lib/kubelet/pods/e6152d07-38d3-42e7-953f-d9747b1f8996/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.609928 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a10db5-2c78-495e-b81c-d0c89e9425ac" path="/var/lib/kubelet/pods/f9a10db5-2c78-495e-b81c-d0c89e9425ac/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.611486 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcec7a86-cb5c-49e9-af77-30958d09c359" path="/var/lib/kubelet/pods/fcec7a86-cb5c-49e9-af77-30958d09c359/volumes" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.614371 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.614402 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sck9\" (UniqueName: \"kubernetes.io/projected/74af5cde-29d3-4ff7-803b-fb335fc8209c-kube-api-access-9sck9\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.614414 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.622273 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "74af5cde-29d3-4ff7-803b-fb335fc8209c" (UID: "74af5cde-29d3-4ff7-803b-fb335fc8209c"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.662888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "74af5cde-29d3-4ff7-803b-fb335fc8209c" (UID: "74af5cde-29d3-4ff7-803b-fb335fc8209c"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.716494 4764 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: I1204 00:04:38.716530 4764 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/74af5cde-29d3-4ff7-803b-fb335fc8209c-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:38 crc kubenswrapper[4764]: E1204 00:04:38.748211 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f is running failed: container process not found" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 00:04:38 crc kubenswrapper[4764]: E1204 00:04:38.752040 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f is running failed: container process not found" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 00:04:38 
crc kubenswrapper[4764]: E1204 00:04:38.753436 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f is running failed: container process not found" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 00:04:38 crc kubenswrapper[4764]: E1204 00:04:38.753478 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="155f4570-7769-42ab-8bc0-168dba070531" containerName="nova-cell1-conductor-conductor" Dec 04 00:04:39 crc kubenswrapper[4764]: E1204 00:04:39.026008 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 00:04:39 crc kubenswrapper[4764]: E1204 00:04:39.026412 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data podName:76708e9b-1db4-42ca-94d2-7ff96d08d855 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:43.026375703 +0000 UTC m=+1418.787700114 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data") pod "rabbitmq-server-0" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855") : configmap "rabbitmq-config-data" not found Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.149561 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron4ef6-account-delete-l854j"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.184287 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.184561 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-central-agent" containerID="cri-o://581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93" gracePeriod=30 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.184678 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="sg-core" containerID="cri-o://52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81" gracePeriod=30 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.184700 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-notification-agent" containerID="cri-o://6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db" gracePeriod=30 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.184829 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="proxy-httpd" containerID="cri-o://76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7" gracePeriod=30 Dec 04 00:04:39 crc 
kubenswrapper[4764]: I1204 00:04:39.242216 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.242461 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ef7e0298-05be-4a37-a1d3-44632ea1d770" containerName="kube-state-metrics" containerID="cri-o://f01aa606ef44b4da2e845d6aab65dfe450e2d390badf9f1d24c7c03e50b3beb3" gracePeriod=30 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.287108 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331448 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-config-data\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331507 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-etc-swift\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331528 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-log-httpd\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-combined-ca-bundle\") pod 
\"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-public-tls-certs\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331667 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-run-httpd\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331684 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-internal-tls-certs\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.331744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrrfz\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-kube-api-access-lrrfz\") pod \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\" (UID: \"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.338453 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.339097 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.345901 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.350900 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.354021 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.355167 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-kube-api-access-lrrfz" (OuterVolumeSpecName: "kube-api-access-lrrfz") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "kube-api-access-lrrfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.381674 4764 generic.go:334] "Generic (PLEG): container finished" podID="aa27bce9-febf-497d-ad48-21b087064f34" containerID="81e666919c04f56edfdbd9ce12296a18ba6926e9bd2b9f9641241de048362065" exitCode=0 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.381969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance6ca1-account-delete-t8t45" event={"ID":"aa27bce9-febf-497d-ad48-21b087064f34","Type":"ContainerDied","Data":"81e666919c04f56edfdbd9ce12296a18ba6926e9bd2b9f9641241de048362065"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.406478 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4ef6-account-delete-l854j" event={"ID":"62ce6a22-4796-4b94-9c53-d3088cff26f1","Type":"ContainerStarted","Data":"74c4a4212866150be76b57688d67c2614af9fceaf40e41900e4aadcf71403b1b"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.407168 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.413262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"74af5cde-29d3-4ff7-803b-fb335fc8209c","Type":"ContainerDied","Data":"fdccedd548d43aa1a4310ba8a3024c12953b9bef6344491d42091024a15ca257"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.413320 4764 scope.go:117] "RemoveContainer" containerID="49b1d45450edbc5e515da4d8f049c2433b6679ef6419dc9e04e61f99fccf319b" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.413405 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzv2n\" (UniqueName: \"kubernetes.io/projected/0baff485-3721-45b5-9177-96c30ce03251-kube-api-access-rzv2n\") pod \"0baff485-3721-45b5-9177-96c30ce03251\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433066 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-config-data\") pod \"0baff485-3721-45b5-9177-96c30ce03251\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433090 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-combined-ca-bundle\") pod \"155f4570-7769-42ab-8bc0-168dba070531\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433212 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-combined-ca-bundle\") pod \"0baff485-3721-45b5-9177-96c30ce03251\" (UID: \"0baff485-3721-45b5-9177-96c30ce03251\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-992lt\" (UniqueName: 
\"kubernetes.io/projected/155f4570-7769-42ab-8bc0-168dba070531-kube-api-access-992lt\") pod \"155f4570-7769-42ab-8bc0-168dba070531\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433315 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-kolla-config\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433363 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-operator-scripts\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433407 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l78mv\" (UniqueName: \"kubernetes.io/projected/6160ab00-1691-41f8-9902-80d33e435770-kube-api-access-l78mv\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433432 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6160ab00-1691-41f8-9902-80d33e435770-config-data-generated\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.433473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-config-data\") pod \"155f4570-7769-42ab-8bc0-168dba070531\" (UID: \"155f4570-7769-42ab-8bc0-168dba070531\") " Dec 04 00:04:39 crc 
kubenswrapper[4764]: I1204 00:04:39.433496 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-galera-tls-certs\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.437550 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-config-data-default\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.437611 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-combined-ca-bundle\") pod \"6160ab00-1691-41f8-9902-80d33e435770\" (UID: \"6160ab00-1691-41f8-9902-80d33e435770\") " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.438326 4764 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.438346 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.438357 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.438370 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrrfz\" (UniqueName: 
\"kubernetes.io/projected/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-kube-api-access-lrrfz\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.439509 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.439702 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6160ab00-1691-41f8-9902-80d33e435770-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.440302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.440964 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapifdfe-account-delete-lncz4"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.444613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement1158-account-delete-dqsj8" event={"ID":"bd7a5353-be52-43e9-9490-530240b943fe","Type":"ContainerStarted","Data":"bea567c860a67687e511f5e3309950a0b1674743af37c3d8008a61251396a5e2"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.446616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.452390 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0baff485-3721-45b5-9177-96c30ce03251-kube-api-access-rzv2n" (OuterVolumeSpecName: "kube-api-access-rzv2n") pod "0baff485-3721-45b5-9177-96c30ce03251" (UID: "0baff485-3721-45b5-9177-96c30ce03251"). InnerVolumeSpecName "kube-api-access-rzv2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.456189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc616-account-delete-ls8lk" event={"ID":"05b8c157-de2d-4811-a625-1a77c3c7b37b","Type":"ContainerStarted","Data":"2af4ca914fe57db488f67f940ea99bb6dad3c032cc7e5065b17a93cceba79ced"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.486783 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell04790-account-delete-g9286"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.499642 4764 generic.go:334] "Generic (PLEG): container finished" podID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerID="efeadb11f0ef1de95143824a054c252e1663f1c06d14e2f5070aa509d85dd5bf" exitCode=0 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.499709 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6","Type":"ContainerDied","Data":"efeadb11f0ef1de95143824a054c252e1663f1c06d14e2f5070aa509d85dd5bf"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.526026 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.526252 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="662de035-d0f1-4a65-98ad-161d6f21bd26" containerName="memcached" containerID="cri-o://ac966fb4e19027f88b3b69616fbd7358921c916973249c52a6951d8a77e62d9f" gracePeriod=30 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.535428 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-c2ls4"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.540902 4764 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-kolla-config\") on node \"crc\" 
DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.540932 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.540942 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6160ab00-1691-41f8-9902-80d33e435770-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.540953 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6160ab00-1691-41f8-9902-80d33e435770-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.540963 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzv2n\" (UniqueName: \"kubernetes.io/projected/0baff485-3721-45b5-9177-96c30ce03251-kube-api-access-rzv2n\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.560724 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155f4570-7769-42ab-8bc0-168dba070531-kube-api-access-992lt" (OuterVolumeSpecName: "kube-api-access-992lt") pod "155f4570-7769-42ab-8bc0-168dba070531" (UID: "155f4570-7769-42ab-8bc0-168dba070531"). InnerVolumeSpecName "kube-api-access-992lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.560816 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6160ab00-1691-41f8-9902-80d33e435770-kube-api-access-l78mv" (OuterVolumeSpecName: "kube-api-access-l78mv") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). 
InnerVolumeSpecName "kube-api-access-l78mv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.575967 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kh5nb"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.585301 4764 generic.go:334] "Generic (PLEG): container finished" podID="6160ab00-1691-41f8-9902-80d33e435770" containerID="6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4" exitCode=0 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.585380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6160ab00-1691-41f8-9902-80d33e435770","Type":"ContainerDied","Data":"6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.585406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6160ab00-1691-41f8-9902-80d33e435770","Type":"ContainerDied","Data":"62a1bed6b510cae08735450564fad85c90c569d930b22c9100c19e60d9ad3c00"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.585528 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.585983 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kh5nb"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.595710 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-c2ls4"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.609681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.611222 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-754f454454-nb48r"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.611490 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-754f454454-nb48r" podUID="803d2331-67a9-462d-9e22-09a112264732" containerName="keystone-api" containerID="cri-o://5ba3f5a666e85c1ab0ed9cf5640222917b29d19dae5da32e8c3a64bf079caafd" gracePeriod=30 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.617389 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.625997 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.672430 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.672959 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="ef7e0298-05be-4a37-a1d3-44632ea1d770" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.188:8081/readyz\": dial tcp 10.217.0.188:8081: connect: connection refused" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.679423 4764 generic.go:334] "Generic (PLEG): container finished" podID="155f4570-7769-42ab-8bc0-168dba070531" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" exitCode=0 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.679503 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"155f4570-7769-42ab-8bc0-168dba070531","Type":"ContainerDied","Data":"bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f"} Dec 04 00:04:39 crc 
kubenswrapper[4764]: I1204 00:04:39.679526 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"155f4570-7769-42ab-8bc0-168dba070531","Type":"ContainerDied","Data":"0fc61c784df839e37e29bf8736b71d7cfdfc5c4e29c5d9138a7389d570e58bb4"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.679612 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.685606 4764 generic.go:334] "Generic (PLEG): container finished" podID="0baff485-3721-45b5-9177-96c30ce03251" containerID="c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab" exitCode=0 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.686075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0baff485-3721-45b5-9177-96c30ce03251","Type":"ContainerDied","Data":"c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.686477 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.687551 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0baff485-3721-45b5-9177-96c30ce03251","Type":"ContainerDied","Data":"c980cd93b501010ed400c99974b61793e3928cbcc6ffaebbe39267796eced2ae"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.695251 4764 generic.go:334] "Generic (PLEG): container finished" podID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerID="7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b" exitCode=0 Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.695461 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.695513 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6677596fcf-6rh2n" event={"ID":"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7","Type":"ContainerDied","Data":"7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.695541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6677596fcf-6rh2n" event={"ID":"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7","Type":"ContainerDied","Data":"9e51ccda65fd06323a1fed7f40332cb10d519bc63c4d26e908da491229a063ed"} Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.696284 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-992lt\" (UniqueName: \"kubernetes.io/projected/155f4570-7769-42ab-8bc0-168dba070531-kube-api-access-992lt\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.697097 4764 scope.go:117] "RemoveContainer" containerID="6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.701279 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l78mv\" (UniqueName: \"kubernetes.io/projected/6160ab00-1691-41f8-9902-80d33e435770-kube-api-access-l78mv\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.701328 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.704068 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.162:8776/healthcheck\": read tcp 
10.217.0.2:58818->10.217.0.162:8776: read: connection reset by peer" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.709143 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-ccgpx"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.732531 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-ccgpx"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.737393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.740848 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance6ca1-account-delete-t8t45"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.746018 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6ca1-account-create-update-7rgls"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.765413 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6ca1-account-create-update-7rgls"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.787872 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-2tx5s"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.793148 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:44396->10.217.0.201:8775: read: connection reset by peer" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.793607 4764 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/nova-metadata-0" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.201:8775/\": read tcp 10.217.0.2:44382->10.217.0.201:8775: read: connection reset by peer" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.796522 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-2tx5s"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.802743 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3d02-account-create-update-97vbs"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.804195 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.808184 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3d02-account-create-update-97vbs"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.817668 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-sphs6"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.826558 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-sphs6"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.846438 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder2a5d-account-delete-tbzqq"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.869032 4764 scope.go:117] "RemoveContainer" containerID="ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.871695 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2a5d-account-create-update-7b2vr"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.881464 4764 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-2a5d-account-create-update-7b2vr"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.924870 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-g9l4t"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.944694 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-g9l4t"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.966853 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc616-account-delete-ls8lk"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.975021 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c616-account-create-update-x6pww"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.984530 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c616-account-create-update-x6pww"] Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.995111 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-789dfd9c8d-k4z96" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:59990->10.217.0.157:9311: read: connection reset by peer" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.995295 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-789dfd9c8d-k4z96" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:59992->10.217.0.157:9311: read: connection reset by peer" Dec 04 00:04:39 crc kubenswrapper[4764]: I1204 00:04:39.995590 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"0baff485-3721-45b5-9177-96c30ce03251" (UID: "0baff485-3721-45b5-9177-96c30ce03251"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.013506 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.057958 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "155f4570-7769-42ab-8bc0-168dba070531" (UID: "155f4570-7769-42ab-8bc0-168dba070531"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.102116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jzlcq"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.102126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-config-data" (OuterVolumeSpecName: "config-data") pod "0baff485-3721-45b5-9177-96c30ce03251" (UID: "0baff485-3721-45b5-9177-96c30ce03251"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.115066 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0baff485-3721-45b5-9177-96c30ce03251-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.115092 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.117359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-config-data" (OuterVolumeSpecName: "config-data") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.118452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.122682 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jzlcq"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.130738 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement1158-account-delete-dqsj8"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.151072 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-config-data" (OuterVolumeSpecName: "config-data") pod "155f4570-7769-42ab-8bc0-168dba070531" (UID: "155f4570-7769-42ab-8bc0-168dba070531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.151175 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1158-account-create-update-vpcpl"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.151600 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="662de035-d0f1-4a65-98ad-161d6f21bd26" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.104:11211: connect: connection refused" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.159152 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.160848 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1158-account-create-update-vpcpl"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.168855 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod 
"62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.194248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" (UID: "62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.194337 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6160ab00-1691-41f8-9902-80d33e435770" (UID: "6160ab00-1691-41f8-9902-80d33e435770"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227098 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/155f4570-7769-42ab-8bc0-168dba070531-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227154 4764 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6160ab00-1691-41f8-9902-80d33e435770-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227167 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227178 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227211 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227223 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.227233 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.280768 4764 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-db-create-6bz55"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.292926 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6bz55"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.324892 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron4ef6-account-delete-l854j"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.328296 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4ef6-account-create-update-hgk4l"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.329491 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="d8586564-9024-4375-a5f7-e75844abe723" containerName="galera" containerID="cri-o://40932f1f044feae057b1145cd8eb76e3370493aa44a7c5f0f8b568439dbde7ab" gracePeriod=30 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.334846 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4ef6-account-create-update-hgk4l"] Dec 04 00:04:40 crc kubenswrapper[4764]: E1204 00:04:40.370926 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 00:04:40 crc kubenswrapper[4764]: E1204 00:04:40.372487 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 00:04:40 crc kubenswrapper[4764]: E1204 00:04:40.373424 4764 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 00:04:40 crc kubenswrapper[4764]: E1204 00:04:40.373455 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" containerName="nova-cell0-conductor-conductor" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.431411 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6d45ff9d86-725zf" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.149:8778/\": dial tcp 10.217.0.149:8778: connect: connection refused" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.431759 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-6d45ff9d86-725zf" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.149:8778/\": dial tcp 10.217.0.149:8778: connect: connection refused" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.646582 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1258d21b-9d01-4f1c-9508-ec94292425eb" path="/var/lib/kubelet/pods/1258d21b-9d01-4f1c-9508-ec94292425eb/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.648121 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294a0599-742d-479d-9758-12c58c571da7" path="/var/lib/kubelet/pods/294a0599-742d-479d-9758-12c58c571da7/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.648766 4764 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="2aa3596a-fac7-4d92-93fc-4d609fb54513" path="/var/lib/kubelet/pods/2aa3596a-fac7-4d92-93fc-4d609fb54513/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.649385 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122" path="/var/lib/kubelet/pods/3dbdcf06-ed37-4fcf-8f2f-0dfe2501e122/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.650393 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b156df9-8487-4d3c-ae04-53a8ac281484" path="/var/lib/kubelet/pods/4b156df9-8487-4d3c-ae04-53a8ac281484/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.650886 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74af5cde-29d3-4ff7-803b-fb335fc8209c" path="/var/lib/kubelet/pods/74af5cde-29d3-4ff7-803b-fb335fc8209c/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.651365 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="828472d4-0a60-47d4-906b-107c9cb63417" path="/var/lib/kubelet/pods/828472d4-0a60-47d4-906b-107c9cb63417/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.652352 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="992d93ad-5b93-4369-adec-095082f4da81" path="/var/lib/kubelet/pods/992d93ad-5b93-4369-adec-095082f4da81/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.652907 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6f11e7-3aec-43c4-a35d-13882953a668" path="/var/lib/kubelet/pods/9f6f11e7-3aec-43c4-a35d-13882953a668/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.653605 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78c3705-cf1f-41d5-b80b-323d167e7cba" path="/var/lib/kubelet/pods/a78c3705-cf1f-41d5-b80b-323d167e7cba/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.654539 4764 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b8181a75-e25f-462c-90cb-1fddeea8ae6c" path="/var/lib/kubelet/pods/b8181a75-e25f-462c-90cb-1fddeea8ae6c/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.655071 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938" path="/var/lib/kubelet/pods/cbfa6e62-f7b6-44af-87eb-6f6b2c8d6938/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.655557 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb540b8-d57c-4a9e-ba19-52a0ed22cb98" path="/var/lib/kubelet/pods/ccb540b8-d57c-4a9e-ba19-52a0ed22cb98/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.671602 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7263a5a-cae0-4bbb-8429-d4d59d79c63b" path="/var/lib/kubelet/pods/e7263a5a-cae0-4bbb-8429-d4d59d79c63b/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.673217 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6d78be-5cb8-49c0-9cde-4bfae85c513a" path="/var/lib/kubelet/pods/eb6d78be-5cb8-49c0-9cde-4bfae85c513a/volumes" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.679800 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cxmtq"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.679850 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cxmtq"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.679875 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fdfe-account-create-update-s6znl"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.706089 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fdfe-account-create-update-s6znl"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.724203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc616-account-delete-ls8lk" 
event={"ID":"05b8c157-de2d-4811-a625-1a77c3c7b37b","Type":"ContainerStarted","Data":"74e119881855aa11e997be4d47415f2b08315ba62b5e9a6db9ebd6318b71cac9"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.724370 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbicanc616-account-delete-ls8lk" podUID="05b8c157-de2d-4811-a625-1a77c3c7b37b" containerName="mariadb-account-delete" containerID="cri-o://74e119881855aa11e997be4d47415f2b08315ba62b5e9a6db9ebd6318b71cac9" gracePeriod=30 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.748952 4764 generic.go:334] "Generic (PLEG): container finished" podID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerID="0f1e5405b57025512e61585a9e9a3c74dacc900d7181ee5cacc158e3f86552fc" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.749009 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-789dfd9c8d-k4z96" event={"ID":"ec3e74e4-e0bc-45a3-a568-c70087b73572","Type":"ContainerDied","Data":"0f1e5405b57025512e61585a9e9a3c74dacc900d7181ee5cacc158e3f86552fc"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.753846 4764 generic.go:334] "Generic (PLEG): container finished" podID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerID="94f6b973870b4eea1c81fd05faef27612f6926a743b37cea298cec8baf77cde6" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.754620 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef3ecde-294a-410a-ba90-d08a00674b9f","Type":"ContainerDied","Data":"94f6b973870b4eea1c81fd05faef27612f6926a743b37cea298cec8baf77cde6"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.754765 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7ef3ecde-294a-410a-ba90-d08a00674b9f","Type":"ContainerDied","Data":"ffa902d357fac0cc212f91fd781954681391e6e81b94c3625926118e96a64abb"} Dec 04 00:04:40 crc 
kubenswrapper[4764]: I1204 00:04:40.755086 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffa902d357fac0cc212f91fd781954681391e6e81b94c3625926118e96a64abb" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.760459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement1158-account-delete-dqsj8" event={"ID":"bd7a5353-be52-43e9-9490-530240b943fe","Type":"ContainerStarted","Data":"f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.761122 4764 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement1158-account-delete-dqsj8" secret="" err="secret \"galera-openstack-dockercfg-6x254\" not found" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.777224 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapifdfe-account-delete-lncz4"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.779651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifdfe-account-delete-lncz4" event={"ID":"0fd61cda-9474-470d-aee3-9806975eccaf","Type":"ContainerStarted","Data":"8368d75af9aa569c86737121e680f0a271b2e9342b16b6cf9f892f561cf9669a"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.782553 4764 generic.go:334] "Generic (PLEG): container finished" podID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerID="71b9323307a4b2c17ee25a0cbf4f507e0f54dcd057e06354d3d97cbd5b67d385" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.782600 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" event={"ID":"ee8deb66-8364-4d9c-bd17-e4ad937a35e2","Type":"ContainerDied","Data":"71b9323307a4b2c17ee25a0cbf4f507e0f54dcd057e06354d3d97cbd5b67d385"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.789964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6","Type":"ContainerDied","Data":"89d7833bf6d408563cb0d7a8f6e57e65de5540cc3cf157d12e68b7c5b666161b"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.790143 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89d7833bf6d408563cb0d7a8f6e57e65de5540cc3cf157d12e68b7c5b666161b" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.792960 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerID="9063bab53ed33482cd89acc542fa22e753f09e7ceb951b53766f7dc55ea3fcda" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.793100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6de1323-46ca-460b-8a8f-620125ce1d7f","Type":"ContainerDied","Data":"9063bab53ed33482cd89acc542fa22e753f09e7ceb951b53766f7dc55ea3fcda"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.795128 4764 generic.go:334] "Generic (PLEG): container finished" podID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerID="bfa789db8eedd550d660743c753b3bea2fab2bce89eb7947314062414fa5026a" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.795242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae2d9b02-4247-444e-ba56-05d65493dd3e","Type":"ContainerDied","Data":"bfa789db8eedd550d660743c753b3bea2fab2bce89eb7947314062414fa5026a"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.796899 4764 generic.go:334] "Generic (PLEG): container finished" podID="7323df53-27cc-46a0-ad81-1e916db379af" containerID="6ef4634d4e9a70890a62dc5bc5ec2d0dea18b5551be672ee6677a592a96cead8" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.796999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"7323df53-27cc-46a0-ad81-1e916db379af","Type":"ContainerDied","Data":"6ef4634d4e9a70890a62dc5bc5ec2d0dea18b5551be672ee6677a592a96cead8"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.805387 4764 generic.go:334] "Generic (PLEG): container finished" podID="662de035-d0f1-4a65-98ad-161d6f21bd26" containerID="ac966fb4e19027f88b3b69616fbd7358921c916973249c52a6951d8a77e62d9f" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.805572 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"662de035-d0f1-4a65-98ad-161d6f21bd26","Type":"ContainerDied","Data":"ac966fb4e19027f88b3b69616fbd7358921c916973249c52a6951d8a77e62d9f"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.827978 4764 generic.go:334] "Generic (PLEG): container finished" podID="3a7dd687-d272-4102-bc70-199b44353a21" containerID="05700e796c28abd13f4c5635747b2b007e49376c6c97684f76cd88c2347348c6" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.828073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54ddd476ff-9v8dj" event={"ID":"3a7dd687-d272-4102-bc70-199b44353a21","Type":"ContainerDied","Data":"05700e796c28abd13f4c5635747b2b007e49376c6c97684f76cd88c2347348c6"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.849852 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-6pvq9"] Dec 04 00:04:40 crc kubenswrapper[4764]: E1204 00:04:40.852976 4764 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 00:04:40 crc kubenswrapper[4764]: E1204 00:04:40.853034 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts podName:bd7a5353-be52-43e9-9490-530240b943fe nodeName:}" failed. No retries permitted until 2025-12-04 00:04:41.353018809 +0000 UTC m=+1417.114343220 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts") pod "placement1158-account-delete-dqsj8" (UID: "bd7a5353-be52-43e9-9490-530240b943fe") : configmap "openstack-scripts" not found Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.854229 4764 generic.go:334] "Generic (PLEG): container finished" podID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerID="6f3ad0a68a4b98fc593b43b878fbce89575024644243ce672d381c81a0dabf6a" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.854283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9218a48-75e0-47ae-a2ac-2d2fa4d08971","Type":"ContainerDied","Data":"6f3ad0a68a4b98fc593b43b878fbce89575024644243ce672d381c81a0dabf6a"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.857572 4764 generic.go:334] "Generic (PLEG): container finished" podID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" containerID="9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.857610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58eedbd8-7bbd-444f-bd11-784c5e7429fa","Type":"ContainerDied","Data":"9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.859103 4764 generic.go:334] "Generic (PLEG): container finished" podID="ef7e0298-05be-4a37-a1d3-44632ea1d770" containerID="f01aa606ef44b4da2e845d6aab65dfe450e2d390badf9f1d24c7c03e50b3beb3" exitCode=2 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.859148 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef7e0298-05be-4a37-a1d3-44632ea1d770","Type":"ContainerDied","Data":"f01aa606ef44b4da2e845d6aab65dfe450e2d390badf9f1d24c7c03e50b3beb3"} Dec 04 00:04:40 crc 
kubenswrapper[4764]: I1204 00:04:40.859164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ef7e0298-05be-4a37-a1d3-44632ea1d770","Type":"ContainerDied","Data":"28309304e167a7bceccd8028823e8b3a0de9d42a293cae76ebfec32f2a42cd97"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.859175 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28309304e167a7bceccd8028823e8b3a0de9d42a293cae76ebfec32f2a42cd97" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.860589 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04790-account-delete-g9286" event={"ID":"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa","Type":"ContainerStarted","Data":"21b16aaac1bbdcab2a6868dac32429a999c87640ef00a0fb16cd38015e8b8be6"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.862337 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder2a5d-account-delete-tbzqq" event={"ID":"047e426c-4178-43ce-8a09-ff5b4a6a13f1","Type":"ContainerDied","Data":"9bbc95a4750dd933a869b96126183ba09d4a84e99d7126b4f43c1e98966d06fc"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.862372 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbc95a4750dd933a869b96126183ba09d4a84e99d7126b4f43c1e98966d06fc" Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.869046 4764 generic.go:334] "Generic (PLEG): container finished" podID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerID="76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.869073 4764 generic.go:334] "Generic (PLEG): container finished" podID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerID="52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81" exitCode=2 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.869083 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerID="581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.869128 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerDied","Data":"76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.869155 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerDied","Data":"52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.869168 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerDied","Data":"581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.877013 4764 generic.go:334] "Generic (PLEG): container finished" podID="3cad4f7f-7546-406c-822b-b6f77365d830" containerID="c0f14891d1b59f0d4bb85f831e2a4b7f44911359e51183f14fe60719afd8d989" exitCode=0 Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.877102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d45ff9d86-725zf" event={"ID":"3cad4f7f-7546-406c-822b-b6f77365d830","Type":"ContainerDied","Data":"c0f14891d1b59f0d4bb85f831e2a4b7f44911359e51183f14fe60719afd8d989"} Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.915862 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-6pvq9"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.922173 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4790-account-create-update-dbw9n"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.949160 4764 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell04790-account-delete-g9286"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.968821 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4790-account-create-update-dbw9n"] Dec 04 00:04:40 crc kubenswrapper[4764]: I1204 00:04:40.993536 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbicanc616-account-delete-ls8lk" podStartSLOduration=6.993508889 podStartE2EDuration="6.993508889s" podCreationTimestamp="2025-12-04 00:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:04:40.742480967 +0000 UTC m=+1416.503805378" watchObservedRunningTime="2025-12-04 00:04:40.993508889 +0000 UTC m=+1416.754833310" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.000849 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement1158-account-delete-dqsj8" podStartSLOduration=7.000814149 podStartE2EDuration="7.000814149s" podCreationTimestamp="2025-12-04 00:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:04:40.786308136 +0000 UTC m=+1416.547632547" watchObservedRunningTime="2025-12-04 00:04:41.000814149 +0000 UTC m=+1416.762138570" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.251199 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.254943 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command 
error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.266505 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.266552 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="ovn-northd" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.281842 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f9fbbf6f7-znhdg" podUID="e6152d07-38d3-42e7-953f-d9747b1f8996" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: i/o timeout" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.345253 4764 scope.go:117] "RemoveContainer" containerID="6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.345635 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4\": container with ID starting with 6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4 not found: ID does not exist" containerID="6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 
00:04:41.345674 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4"} err="failed to get container status \"6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4\": rpc error: code = NotFound desc = could not find container \"6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4\": container with ID starting with 6f44bdf9e7e44004f3cbd98ef7ac0c68e15b64057dbfdd4d790f78fc1767ecd4 not found: ID does not exist" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.345702 4764 scope.go:117] "RemoveContainer" containerID="ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.346135 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d\": container with ID starting with ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d not found: ID does not exist" containerID="ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.346152 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d"} err="failed to get container status \"ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d\": rpc error: code = NotFound desc = could not find container \"ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d\": container with ID starting with ec9b26f4b26cc74f787655755053be83205ed3cf496ea7b7f28f029c4fc9820d not found: ID does not exist" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.346165 4764 scope.go:117] "RemoveContainer" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" Dec 04 00:04:41 crc 
kubenswrapper[4764]: E1204 00:04:41.362512 4764 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.362584 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts podName:bd7a5353-be52-43e9-9490-530240b943fe nodeName:}" failed. No retries permitted until 2025-12-04 00:04:42.362567738 +0000 UTC m=+1418.123892149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts") pod "placement1158-account-delete-dqsj8" (UID: "bd7a5353-be52-43e9-9490-530240b943fe") : configmap "openstack-scripts" not found Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.474033 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.495506 4764 scope.go:117] "RemoveContainer" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.497169 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f\": container with ID starting with bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f not found: ID does not exist" containerID="bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.497210 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f"} err="failed to get container status \"bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f\": rpc error: 
code = NotFound desc = could not find container \"bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f\": container with ID starting with bdadd9752af31acf4b665330b82a1da4f606dde827b9e58a1029af3958b5a78f not found: ID does not exist" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.497238 4764 scope.go:117] "RemoveContainer" containerID="c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.539858 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.546615 4764 scope.go:117] "RemoveContainer" containerID="c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.548773 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab\": container with ID starting with c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab not found: ID does not exist" containerID="c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.548934 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab"} err="failed to get container status \"c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab\": rpc error: code = NotFound desc = could not find container \"c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab\": container with ID starting with c98e4d0951fbe6c2c1a4af05d4244b4fd9166bff5a44d87c2a191e80e5de38ab not found: ID does not exist" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.549072 4764 scope.go:117] "RemoveContainer" 
containerID="7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.611700 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.615264 4764 scope.go:117] "RemoveContainer" containerID="3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.629449 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.636125 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669661 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data\") pod \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669712 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvnbm\" (UniqueName: \"kubernetes.io/projected/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-kube-api-access-jvnbm\") pod \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669756 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-combined-ca-bundle\") pod \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047e426c-4178-43ce-8a09-ff5b4a6a13f1-operator-scripts\") pod \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669908 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j58pv\" (UniqueName: \"kubernetes.io/projected/047e426c-4178-43ce-8a09-ff5b4a6a13f1-kube-api-access-j58pv\") pod \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\" (UID: \"047e426c-4178-43ce-8a09-ff5b4a6a13f1\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-scripts\") pod \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.669998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-etc-machine-id\") pod \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.670024 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data-custom\") pod \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\" (UID: \"80b73a08-1afd-4f2a-b565-fd8f2b06c0b6\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.670483 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047e426c-4178-43ce-8a09-ff5b4a6a13f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "047e426c-4178-43ce-8a09-ff5b4a6a13f1" (UID: "047e426c-4178-43ce-8a09-ff5b4a6a13f1"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.671066 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" (UID: "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.675487 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.676986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-kube-api-access-jvnbm" (OuterVolumeSpecName: "kube-api-access-jvnbm") pod "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" (UID: "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6"). InnerVolumeSpecName "kube-api-access-jvnbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.678986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-scripts" (OuterVolumeSpecName: "scripts") pod "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" (UID: "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.679068 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.682867 4764 scope.go:117] "RemoveContainer" containerID="7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.683027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047e426c-4178-43ce-8a09-ff5b4a6a13f1-kube-api-access-j58pv" (OuterVolumeSpecName: "kube-api-access-j58pv") pod "047e426c-4178-43ce-8a09-ff5b4a6a13f1" (UID: "047e426c-4178-43ce-8a09-ff5b4a6a13f1"). InnerVolumeSpecName "kube-api-access-j58pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.683451 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" (UID: "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.684359 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b\": container with ID starting with 7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b not found: ID does not exist" containerID="7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.684398 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b"} err="failed to get container status \"7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b\": rpc error: code = NotFound desc = could not find container \"7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b\": container with ID starting with 7ed21ec110f9ad2b41d764bd40cd385c25bb6c05bb35b25503667a8574dd620b not found: ID does not exist" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.684423 4764 scope.go:117] "RemoveContainer" containerID="3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf" Dec 04 00:04:41 crc kubenswrapper[4764]: E1204 00:04:41.685680 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf\": container with ID starting with 3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf not found: ID does not exist" containerID="3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.685705 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf"} err="failed 
to get container status \"3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf\": rpc error: code = NotFound desc = could not find container \"3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf\": container with ID starting with 3ad6a814ed26fc1b251ccbfa0e9cd885fb5f8413fdfbb63714025508158dbddf not found: ID does not exist" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.699004 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.701040 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.701494 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.731020 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.740456 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.759646 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772039 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323df53-27cc-46a0-ad81-1e916db379af-logs\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772377 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data-custom\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-logs\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/7ef3ecde-294a-410a-ba90-d08a00674b9f-kube-api-access-p62fv\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772485 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-certs\") pod \"ef7e0298-05be-4a37-a1d3-44632ea1d770\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-scripts\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772519 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7323df53-27cc-46a0-ad81-1e916db379af-etc-machine-id\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772536 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-config-data\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-internal-tls-certs\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-combined-ca-bundle\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772659 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzjcb\" (UniqueName: \"kubernetes.io/projected/7323df53-27cc-46a0-ad81-1e916db379af-kube-api-access-pzjcb\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-httpd-run\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772724 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5wmg\" (UniqueName: \"kubernetes.io/projected/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-api-access-v5wmg\") pod \"ef7e0298-05be-4a37-a1d3-44632ea1d770\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772775 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-combined-ca-bundle\") pod \"ef7e0298-05be-4a37-a1d3-44632ea1d770\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.772923 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-logs" (OuterVolumeSpecName: "logs") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" 
(UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.773117 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7323df53-27cc-46a0-ad81-1e916db379af-logs" (OuterVolumeSpecName: "logs") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.773695 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-scripts\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.773801 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-combined-ca-bundle\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.773872 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-internal-tls-certs\") pod \"7ef3ecde-294a-410a-ba90-d08a00674b9f\" (UID: \"7ef3ecde-294a-410a-ba90-d08a00674b9f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.773935 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-public-tls-certs\") pod \"7323df53-27cc-46a0-ad81-1e916db379af\" (UID: \"7323df53-27cc-46a0-ad81-1e916db379af\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 
00:04:41.774029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-config\") pod \"ef7e0298-05be-4a37-a1d3-44632ea1d770\" (UID: \"ef7e0298-05be-4a37-a1d3-44632ea1d770\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.784827 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/047e426c-4178-43ce-8a09-ff5b4a6a13f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.781492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-scripts" (OuterVolumeSpecName: "scripts") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.790126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7323df53-27cc-46a0-ad81-1e916db379af-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.792140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.793952 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j58pv\" (UniqueName: \"kubernetes.io/projected/047e426c-4178-43ce-8a09-ff5b4a6a13f1-kube-api-access-j58pv\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.794004 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.794035 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323df53-27cc-46a0-ad81-1e916db379af-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.794047 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.794064 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.794076 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.794094 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvnbm\" (UniqueName: \"kubernetes.io/projected/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-kube-api-access-jvnbm\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.795408 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.796862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7323df53-27cc-46a0-ad81-1e916db379af-kube-api-access-pzjcb" (OuterVolumeSpecName: "kube-api-access-pzjcb") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "kube-api-access-pzjcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.800113 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.812985 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef3ecde-294a-410a-ba90-d08a00674b9f-kube-api-access-p62fv" (OuterVolumeSpecName: "kube-api-access-p62fv") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "kube-api-access-p62fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.827903 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.844987 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-api-access-v5wmg" (OuterVolumeSpecName: "kube-api-access-v5wmg") pod "ef7e0298-05be-4a37-a1d3-44632ea1d770" (UID: "ef7e0298-05be-4a37-a1d3-44632ea1d770"). InnerVolumeSpecName "kube-api-access-v5wmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.845148 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.876530 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-scripts" (OuterVolumeSpecName: "scripts") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.894906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-combined-ca-bundle\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.894940 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-public-tls-certs\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.894980 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrs6s\" (UniqueName: \"kubernetes.io/projected/ec3e74e4-e0bc-45a3-a568-c70087b73572-kube-api-access-jrs6s\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895028 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-internal-tls-certs\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895069 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-scripts\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-config-data\") pod \"ae2d9b02-4247-444e-ba56-05d65493dd3e\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895126 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data\") pod \"3a7dd687-d272-4102-bc70-199b44353a21\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895152 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-config-data\") pod \"662de035-d0f1-4a65-98ad-161d6f21bd26\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895170 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895261 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-config-data\") pod \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.895792 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a7dd687-d272-4102-bc70-199b44353a21-logs\") pod \"3a7dd687-d272-4102-bc70-199b44353a21\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896139 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a7dd687-d272-4102-bc70-199b44353a21-logs" (OuterVolumeSpecName: "logs") pod "3a7dd687-d272-4102-bc70-199b44353a21" (UID: "3a7dd687-d272-4102-bc70-199b44353a21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2d9b02-4247-444e-ba56-05d65493dd3e-logs\") pod \"ae2d9b02-4247-444e-ba56-05d65493dd3e\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896282 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-combined-ca-bundle\") pod \"662de035-d0f1-4a65-98ad-161d6f21bd26\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896347 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cad4f7f-7546-406c-822b-b6f77365d830-logs\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896365 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-combined-ca-bundle\") pod \"3a7dd687-d272-4102-bc70-199b44353a21\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896386 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-combined-ca-bundle\") pod 
\"ae2d9b02-4247-444e-ba56-05d65493dd3e\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896409 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data-custom\") pod \"3a7dd687-d272-4102-bc70-199b44353a21\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896431 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-combined-ca-bundle\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896447 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-nova-metadata-tls-certs\") pod \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-scripts\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896489 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data-custom\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896508 4764 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-kolla-config\") pod \"662de035-d0f1-4a65-98ad-161d6f21bd26\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896527 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-public-tls-certs\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-config-data\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-logs\") pod \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896589 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-779cq\" (UniqueName: \"kubernetes.io/projected/3cad4f7f-7546-406c-822b-b6f77365d830-kube-api-access-779cq\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896621 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gxhs\" (UniqueName: \"kubernetes.io/projected/662de035-d0f1-4a65-98ad-161d6f21bd26-kube-api-access-9gxhs\") pod \"662de035-d0f1-4a65-98ad-161d6f21bd26\" (UID: 
\"662de035-d0f1-4a65-98ad-161d6f21bd26\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896643 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5rqk\" (UniqueName: \"kubernetes.io/projected/ae2d9b02-4247-444e-ba56-05d65493dd3e-kube-api-access-t5rqk\") pod \"ae2d9b02-4247-444e-ba56-05d65493dd3e\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896663 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-logs\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.896688 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897136 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec3e74e4-e0bc-45a3-a568-c70087b73572-logs\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897157 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-internal-tls-certs\") pod \"ae2d9b02-4247-444e-ba56-05d65493dd3e\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzcxx\" (UniqueName: 
\"kubernetes.io/projected/3a7dd687-d272-4102-bc70-199b44353a21-kube-api-access-mzcxx\") pod \"3a7dd687-d272-4102-bc70-199b44353a21\" (UID: \"3a7dd687-d272-4102-bc70-199b44353a21\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897204 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-config-data\") pod \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-public-tls-certs\") pod \"ae2d9b02-4247-444e-ba56-05d65493dd3e\" (UID: \"ae2d9b02-4247-444e-ba56-05d65493dd3e\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897238 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-combined-ca-bundle\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897257 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-config-data\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897272 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-public-tls-certs\") pod \"ec3e74e4-e0bc-45a3-a568-c70087b73572\" (UID: \"ec3e74e4-e0bc-45a3-a568-c70087b73572\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 
00:04:41.897315 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-internal-tls-certs\") pod \"3cad4f7f-7546-406c-822b-b6f77365d830\" (UID: \"3cad4f7f-7546-406c-822b-b6f77365d830\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle\") pod \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897378 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nn4h\" (UniqueName: \"kubernetes.io/projected/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-kube-api-access-8nn4h\") pod \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897411 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-memcached-tls-certs\") pod \"662de035-d0f1-4a65-98ad-161d6f21bd26\" (UID: \"662de035-d0f1-4a65-98ad-161d6f21bd26\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.897989 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898003 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p62fv\" (UniqueName: \"kubernetes.io/projected/7ef3ecde-294a-410a-ba90-d08a00674b9f-kube-api-access-p62fv\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: 
I1204 00:04:41.898014 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898022 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7323df53-27cc-46a0-ad81-1e916db379af-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898041 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898050 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzjcb\" (UniqueName: \"kubernetes.io/projected/7323df53-27cc-46a0-ad81-1e916db379af-kube-api-access-pzjcb\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898060 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7ef3ecde-294a-410a-ba90-d08a00674b9f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898069 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5wmg\" (UniqueName: \"kubernetes.io/projected/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-api-access-v5wmg\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898077 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.898086 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3a7dd687-d272-4102-bc70-199b44353a21-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.901643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-logs" (OuterVolumeSpecName: "logs") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.908624 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e9218a48-75e0-47ae-a2ac-2d2fa4d08971","Type":"ContainerDied","Data":"ea95e4f5c2b03dfe256bd4671694570f8edb1d8654aac8a540b13306cd682257"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.908675 4764 scope.go:117] "RemoveContainer" containerID="6f3ad0a68a4b98fc593b43b878fbce89575024644243ce672d381c81a0dabf6a" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.908841 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.910057 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae2d9b02-4247-444e-ba56-05d65493dd3e-logs" (OuterVolumeSpecName: "logs") pod "ae2d9b02-4247-444e-ba56-05d65493dd3e" (UID: "ae2d9b02-4247-444e-ba56-05d65493dd3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.911068 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cad4f7f-7546-406c-822b-b6f77365d830-logs" (OuterVolumeSpecName: "logs") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.911525 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-logs" (OuterVolumeSpecName: "logs") pod "e9218a48-75e0-47ae-a2ac-2d2fa4d08971" (UID: "e9218a48-75e0-47ae-a2ac-2d2fa4d08971"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.913685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "662de035-d0f1-4a65-98ad-161d6f21bd26" (UID: "662de035-d0f1-4a65-98ad-161d6f21bd26"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.915710 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-config-data" (OuterVolumeSpecName: "config-data") pod "662de035-d0f1-4a65-98ad-161d6f21bd26" (UID: "662de035-d0f1-4a65-98ad-161d6f21bd26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.916001 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e6de1323-46ca-460b-8a8f-620125ce1d7f","Type":"ContainerDied","Data":"3e3a3f739ffdc57403735e8733cadc11c36b2eaaf8f4279757a54059c4482da6"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.916107 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.918561 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae2d9b02-4247-444e-ba56-05d65493dd3e-kube-api-access-t5rqk" (OuterVolumeSpecName: "kube-api-access-t5rqk") pod "ae2d9b02-4247-444e-ba56-05d65493dd3e" (UID: "ae2d9b02-4247-444e-ba56-05d65493dd3e"). InnerVolumeSpecName "kube-api-access-t5rqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.918625 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifdfe-account-delete-lncz4" event={"ID":"0fd61cda-9474-470d-aee3-9806975eccaf","Type":"ContainerStarted","Data":"28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.918971 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapifdfe-account-delete-lncz4" podUID="0fd61cda-9474-470d-aee3-9806975eccaf" containerName="mariadb-account-delete" containerID="cri-o://28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7" gracePeriod=30 Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.919065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec3e74e4-e0bc-45a3-a568-c70087b73572-logs" (OuterVolumeSpecName: "logs") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.921965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance6ca1-account-delete-t8t45" event={"ID":"aa27bce9-febf-497d-ad48-21b087064f34","Type":"ContainerDied","Data":"14d73ef1eb7d96708f245df905b00f90430c0778b19499b78a8419a07a2e0e89"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.921988 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14d73ef1eb7d96708f245df905b00f90430c0778b19499b78a8419a07a2e0e89" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.924352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6d45ff9d86-725zf" event={"ID":"3cad4f7f-7546-406c-822b-b6f77365d830","Type":"ContainerDied","Data":"f0e896d911d0d02d583e2ed95e40887b3c77cf42d7b2ec1e0701283cb3e7858e"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.924431 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6d45ff9d86-725zf" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.936096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-kube-api-access-8nn4h" (OuterVolumeSpecName: "kube-api-access-8nn4h") pod "e9218a48-75e0-47ae-a2ac-2d2fa4d08971" (UID: "e9218a48-75e0-47ae-a2ac-2d2fa4d08971"). InnerVolumeSpecName "kube-api-access-8nn4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.936263 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.936388 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"58eedbd8-7bbd-444f-bd11-784c5e7429fa","Type":"ContainerDied","Data":"44a0f17b16daf8d4539230161a8b8e19c7c6a3b91b157f26cfa18b87763aa04e"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.936662 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cad4f7f-7546-406c-822b-b6f77365d830-kube-api-access-779cq" (OuterVolumeSpecName: "kube-api-access-779cq") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "kube-api-access-779cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.939055 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a7dd687-d272-4102-bc70-199b44353a21" (UID: "3a7dd687-d272-4102-bc70-199b44353a21"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.937357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a7dd687-d272-4102-bc70-199b44353a21-kube-api-access-mzcxx" (OuterVolumeSpecName: "kube-api-access-mzcxx") pod "3a7dd687-d272-4102-bc70-199b44353a21" (UID: "3a7dd687-d272-4102-bc70-199b44353a21"). InnerVolumeSpecName "kube-api-access-mzcxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.938497 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-scripts" (OuterVolumeSpecName: "scripts") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.938956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.938980 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-scripts" (OuterVolumeSpecName: "scripts") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.939008 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec3e74e4-e0bc-45a3-a568-c70087b73572-kube-api-access-jrs6s" (OuterVolumeSpecName: "kube-api-access-jrs6s") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "kube-api-access-jrs6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.941114 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/662de035-d0f1-4a65-98ad-161d6f21bd26-kube-api-access-9gxhs" (OuterVolumeSpecName: "kube-api-access-9gxhs") pod "662de035-d0f1-4a65-98ad-161d6f21bd26" (UID: "662de035-d0f1-4a65-98ad-161d6f21bd26"). InnerVolumeSpecName "kube-api-access-9gxhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.949958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7323df53-27cc-46a0-ad81-1e916db379af","Type":"ContainerDied","Data":"126a699b48bc29e5e1b7670045d2556f1e8160260049cb5acbd2c1cc87cdacd7"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.950048 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.952083 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.962792 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-789dfd9c8d-k4z96" event={"ID":"ec3e74e4-e0bc-45a3-a568-c70087b73572","Type":"ContainerDied","Data":"a26e0bd88ec5655e1113239cccc30f9087d02c9584f62b06373cecbc27cdf14e"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.962856 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-789dfd9c8d-k4z96" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.964662 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae2d9b02-4247-444e-ba56-05d65493dd3e","Type":"ContainerDied","Data":"7d9ad52eccd57f3dc2c2b526078b1c63f8db3caa34ad8712f276ff67f5b60e3c"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.964745 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.966365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"662de035-d0f1-4a65-98ad-161d6f21bd26","Type":"ContainerDied","Data":"91e20ac143ec056870b8d11cd6b57efb90705751500ca056d7496ec367fef543"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.966405 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.968166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-54ddd476ff-9v8dj" event={"ID":"3a7dd687-d272-4102-bc70-199b44353a21","Type":"ContainerDied","Data":"c60cb3e93740d88d05d5b2fbbcc9fc4a6108cc51b9f9d43b26952458f61d1023"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.968210 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-54ddd476ff-9v8dj" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.970393 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.970537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" event={"ID":"ee8deb66-8364-4d9c-bd17-e4ad937a35e2","Type":"ContainerDied","Data":"c80f1c0b8e23174292e6fc7c7bbfc58b19708fa8cd1754335684ab3e339b096d"} Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.970552 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c80f1c0b8e23174292e6fc7c7bbfc58b19708fa8cd1754335684ab3e339b096d" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.970581 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.970697 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder2a5d-account-delete-tbzqq" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.973863 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.973978 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement1158-account-delete-dqsj8" podUID="bd7a5353-be52-43e9-9490-530240b943fe" containerName="mariadb-account-delete" containerID="cri-o://f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2" gracePeriod=30 Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.999630 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql4gt\" (UniqueName: \"kubernetes.io/projected/58eedbd8-7bbd-444f-bd11-784c5e7429fa-kube-api-access-ql4gt\") pod \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.999661 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-combined-ca-bundle\") pod \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\" (UID: \"58eedbd8-7bbd-444f-bd11-784c5e7429fa\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.999740 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lj9m\" (UniqueName: \"kubernetes.io/projected/e6de1323-46ca-460b-8a8f-620125ce1d7f-kube-api-access-7lj9m\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:41 crc kubenswrapper[4764]: I1204 00:04:41.999764 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-httpd-run\") pod \"e6de1323-46ca-460b-8a8f-620125ce1d7f\" (UID: \"e6de1323-46ca-460b-8a8f-620125ce1d7f\") " Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000173 4764 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000188 4764 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000196 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000213 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-779cq\" (UniqueName: \"kubernetes.io/projected/3cad4f7f-7546-406c-822b-b6f77365d830-kube-api-access-779cq\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000287 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gxhs\" (UniqueName: \"kubernetes.io/projected/662de035-d0f1-4a65-98ad-161d6f21bd26-kube-api-access-9gxhs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000297 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5rqk\" (UniqueName: \"kubernetes.io/projected/ae2d9b02-4247-444e-ba56-05d65493dd3e-kube-api-access-t5rqk\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000305 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000324 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node 
\"crc\" " Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000334 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec3e74e4-e0bc-45a3-a568-c70087b73572-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000343 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzcxx\" (UniqueName: \"kubernetes.io/projected/3a7dd687-d272-4102-bc70-199b44353a21-kube-api-access-mzcxx\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000352 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nn4h\" (UniqueName: \"kubernetes.io/projected/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-kube-api-access-8nn4h\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000360 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrs6s\" (UniqueName: \"kubernetes.io/projected/ec3e74e4-e0bc-45a3-a568-c70087b73572-kube-api-access-jrs6s\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000369 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000376 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/662de035-d0f1-4a65-98ad-161d6f21bd26-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000385 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae2d9b02-4247-444e-ba56-05d65493dd3e-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000393 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3cad4f7f-7546-406c-822b-b6f77365d830-logs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000404 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.000417 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.003347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.008353 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapifdfe-account-delete-lncz4" podStartSLOduration=7.008334721 podStartE2EDuration="7.008334721s" podCreationTimestamp="2025-12-04 00:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:04:41.983769506 +0000 UTC m=+1417.745093917" watchObservedRunningTime="2025-12-04 00:04:42.008334721 +0000 UTC m=+1417.769659132" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.050884 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58eedbd8-7bbd-444f-bd11-784c5e7429fa-kube-api-access-ql4gt" (OuterVolumeSpecName: "kube-api-access-ql4gt") pod "58eedbd8-7bbd-444f-bd11-784c5e7429fa" (UID: "58eedbd8-7bbd-444f-bd11-784c5e7429fa"). InnerVolumeSpecName "kube-api-access-ql4gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.090695 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6de1323-46ca-460b-8a8f-620125ce1d7f-kube-api-access-7lj9m" (OuterVolumeSpecName: "kube-api-access-7lj9m") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "kube-api-access-7lj9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.109965 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql4gt\" (UniqueName: \"kubernetes.io/projected/58eedbd8-7bbd-444f-bd11-784c5e7429fa-kube-api-access-ql4gt\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.109999 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lj9m\" (UniqueName: \"kubernetes.io/projected/e6de1323-46ca-460b-8a8f-620125ce1d7f-kube-api-access-7lj9m\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.110011 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e6de1323-46ca-460b-8a8f-620125ce1d7f-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.110089 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.110147 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data podName:bda43f61-31ae-4c4c-967e-f0e8d13f5ae9 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:50.110130338 +0000 UTC m=+1425.871454749 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data") pod "rabbitmq-cell1-server-0" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9") : configmap "rabbitmq-cell1-config-data" not found Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.122105 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef7e0298-05be-4a37-a1d3-44632ea1d770" (UID: "ef7e0298-05be-4a37-a1d3-44632ea1d770"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.145905 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-config-data" (OuterVolumeSpecName: "config-data") pod "e9218a48-75e0-47ae-a2ac-2d2fa4d08971" (UID: "e9218a48-75e0-47ae-a2ac-2d2fa4d08971"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.184296 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" (UID: "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.186030 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.211280 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.211320 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.211333 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.211346 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.213814 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "662de035-d0f1-4a65-98ad-161d6f21bd26" (UID: "662de035-d0f1-4a65-98ad-161d6f21bd26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.244640 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.271285 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.273038 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c96d99869-mwjrh" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.151:9696/\": dial tcp 10.217.0.151:9696: connect: connection refused" Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.272087 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.276526 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] 
Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.278404 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.278491 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.280504 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.281685 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58eedbd8-7bbd-444f-bd11-784c5e7429fa" (UID: "58eedbd8-7bbd-444f-bd11-784c5e7429fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.283483 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.283529 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.311865 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9218a48-75e0-47ae-a2ac-2d2fa4d08971" (UID: "e9218a48-75e0-47ae-a2ac-2d2fa4d08971"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.311999 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle\") pod \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\" (UID: \"e9218a48-75e0-47ae-a2ac-2d2fa4d08971\") " Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.312470 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.312495 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.312508 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: W1204 00:04:42.312569 4764 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/e9218a48-75e0-47ae-a2ac-2d2fa4d08971/volumes/kubernetes.io~secret/combined-ca-bundle Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.312578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9218a48-75e0-47ae-a2ac-2d2fa4d08971" (UID: "e9218a48-75e0-47ae-a2ac-2d2fa4d08971"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.318174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.322435 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.348902 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.353381 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "ef7e0298-05be-4a37-a1d3-44632ea1d770" (UID: "ef7e0298-05be-4a37-a1d3-44632ea1d770"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.375392 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.375582 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e9218a48-75e0-47ae-a2ac-2d2fa4d08971" (UID: "e9218a48-75e0-47ae-a2ac-2d2fa4d08971"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.387881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.398066 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae2d9b02-4247-444e-ba56-05d65493dd3e" (UID: "ae2d9b02-4247-444e-ba56-05d65493dd3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414279 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414350 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414363 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414371 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414380 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414706 4764 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414739 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc 
kubenswrapper[4764]: I1204 00:04:42.414750 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.414759 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9218a48-75e0-47ae-a2ac-2d2fa4d08971-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.414776 4764 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 00:04:42 crc kubenswrapper[4764]: E1204 00:04:42.414836 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts podName:bd7a5353-be52-43e9-9490-530240b943fe nodeName:}" failed. No retries permitted until 2025-12-04 00:04:44.414819912 +0000 UTC m=+1420.176144323 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts") pod "placement1158-account-delete-dqsj8" (UID: "bd7a5353-be52-43e9-9490-530240b943fe") : configmap "openstack-scripts" not found Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.418849 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-config-data" (OuterVolumeSpecName: "config-data") pod "ae2d9b02-4247-444e-ba56-05d65493dd3e" (UID: "ae2d9b02-4247-444e-ba56-05d65493dd3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.424333 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "ef7e0298-05be-4a37-a1d3-44632ea1d770" (UID: "ef7e0298-05be-4a37-a1d3-44632ea1d770"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.447106 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a7dd687-d272-4102-bc70-199b44353a21" (UID: "3a7dd687-d272-4102-bc70-199b44353a21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.456108 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.465187 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "662de035-d0f1-4a65-98ad-161d6f21bd26" (UID: "662de035-d0f1-4a65-98ad-161d6f21bd26"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.479553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-config-data" (OuterVolumeSpecName: "config-data") pod "58eedbd8-7bbd-444f-bd11-784c5e7429fa" (UID: "58eedbd8-7bbd-444f-bd11-784c5e7429fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.485621 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.493743 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.499597 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-config-data" (OuterVolumeSpecName: "config-data") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520532 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520561 4764 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7e0298-05be-4a37-a1d3-44632ea1d770-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520573 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58eedbd8-7bbd-444f-bd11-784c5e7429fa-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520583 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520591 4764 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/662de035-d0f1-4a65-98ad-161d6f21bd26-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520600 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520609 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520619 4764 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.520645 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.521778 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data" (OuterVolumeSpecName: "config-data") pod "ec3e74e4-e0bc-45a3-a568-c70087b73572" (UID: "ec3e74e4-e0bc-45a3-a568-c70087b73572"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.527123 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ae2d9b02-4247-444e-ba56-05d65493dd3e" (UID: "ae2d9b02-4247-444e-ba56-05d65493dd3e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.527532 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data" (OuterVolumeSpecName: "config-data") pod "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" (UID: "80b73a08-1afd-4f2a-b565-fd8f2b06c0b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.535907 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e6de1323-46ca-460b-8a8f-620125ce1d7f" (UID: "e6de1323-46ca-460b-8a8f-620125ce1d7f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.565313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae2d9b02-4247-444e-ba56-05d65493dd3e" (UID: "ae2d9b02-4247-444e-ba56-05d65493dd3e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.574932 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data" (OuterVolumeSpecName: "config-data") pod "3a7dd687-d272-4102-bc70-199b44353a21" (UID: "3a7dd687-d272-4102-bc70-199b44353a21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.582660 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data" (OuterVolumeSpecName: "config-data") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.595227 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6160ab00-1691-41f8-9902-80d33e435770" path="/var/lib/kubelet/pods/6160ab00-1691-41f8-9902-80d33e435770/volumes" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.595945 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8da482-3171-426d-b3ed-41db82605e2a" path="/var/lib/kubelet/pods/6d8da482-3171-426d-b3ed-41db82605e2a/volumes" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.596427 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82499c8-3378-44b1-83f4-db79e6bd190b" path="/var/lib/kubelet/pods/d82499c8-3378-44b1-83f4-db79e6bd190b/volumes" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.597454 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd4ab5b-3e62-4708-8322-424df55d8cf4" path="/var/lib/kubelet/pods/dcd4ab5b-3e62-4708-8322-424df55d8cf4/volumes" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.598465 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f618e733-cb75-4d62-ac56-525007f16fb7" path="/var/lib/kubelet/pods/f618e733-cb75-4d62-ac56-525007f16fb7/volumes" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.606499 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-config-data" (OuterVolumeSpecName: "config-data") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.606868 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7323df53-27cc-46a0-ad81-1e916db379af" (UID: "7323df53-27cc-46a0-ad81-1e916db379af"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.612855 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622410 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e6de1323-46ca-460b-8a8f-620125ce1d7f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622448 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622582 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622598 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622609 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae2d9b02-4247-444e-ba56-05d65493dd3e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622620 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622631 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622641 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323df53-27cc-46a0-ad81-1e916db379af-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622652 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a7dd687-d272-4102-bc70-199b44353a21-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.622662 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec3e74e4-e0bc-45a3-a568-c70087b73572-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.636753 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-config-data" (OuterVolumeSpecName: "config-data") pod "7ef3ecde-294a-410a-ba90-d08a00674b9f" (UID: "7ef3ecde-294a-410a-ba90-d08a00674b9f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.643283 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.685204 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3cad4f7f-7546-406c-822b-b6f77365d830" (UID: "3cad4f7f-7546-406c-822b-b6f77365d830"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.723964 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ef3ecde-294a-410a-ba90-d08a00674b9f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.724000 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.724015 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cad4f7f-7546-406c-822b-b6f77365d830-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.768009 4764 scope.go:117] "RemoveContainer" containerID="506954a3df1b596b6cc009eafe6c0378475f7b74778bfc77c7cd68e0dfd9aa9d" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.986872 4764 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f105a7d8-bb79-4578-98fd-aca60d5ffa10/ovn-northd/0.log" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.986933 4764 generic.go:334] "Generic (PLEG): container finished" podID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" exitCode=139 Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.987029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f105a7d8-bb79-4578-98fd-aca60d5ffa10","Type":"ContainerDied","Data":"bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772"} Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.987084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f105a7d8-bb79-4578-98fd-aca60d5ffa10","Type":"ContainerDied","Data":"7888de431fcf4ed90fa1bb9b2d4c56fc3d113862df5a55e1232b18467adc43ef"} Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.987100 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7888de431fcf4ed90fa1bb9b2d4c56fc3d113862df5a55e1232b18467adc43ef" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.988359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04790-account-delete-g9286" event={"ID":"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa","Type":"ContainerStarted","Data":"858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925"} Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.988425 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell04790-account-delete-g9286" podUID="ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" containerName="mariadb-account-delete" containerID="cri-o://858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925" gracePeriod=30 Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.995185 4764 generic.go:334] "Generic 
(PLEG): container finished" podID="d8586564-9024-4375-a5f7-e75844abe723" containerID="40932f1f044feae057b1145cd8eb76e3370493aa44a7c5f0f8b568439dbde7ab" exitCode=0 Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.995250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d8586564-9024-4375-a5f7-e75844abe723","Type":"ContainerDied","Data":"40932f1f044feae057b1145cd8eb76e3370493aa44a7c5f0f8b568439dbde7ab"} Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.995580 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d8586564-9024-4375-a5f7-e75844abe723","Type":"ContainerDied","Data":"eff8889916235921701c7a104e5d82097e60d31f001e1670e592a1447bf09fc2"} Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.995600 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff8889916235921701c7a104e5d82097e60d31f001e1670e592a1447bf09fc2" Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.997151 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4ef6-account-delete-l854j" event={"ID":"62ce6a22-4796-4b94-9c53-d3088cff26f1","Type":"ContainerStarted","Data":"fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c"} Dec 04 00:04:42 crc kubenswrapper[4764]: I1204 00:04:42.997289 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron4ef6-account-delete-l854j" podUID="62ce6a22-4796-4b94-9c53-d3088cff26f1" containerName="mariadb-account-delete" containerID="cri-o://fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c" gracePeriod=30 Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.001219 4764 generic.go:334] "Generic (PLEG): container finished" podID="803d2331-67a9-462d-9e22-09a112264732" containerID="5ba3f5a666e85c1ab0ed9cf5640222917b29d19dae5da32e8c3a64bf079caafd" exitCode=0 Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.001270 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754f454454-nb48r" event={"ID":"803d2331-67a9-462d-9e22-09a112264732","Type":"ContainerDied","Data":"5ba3f5a666e85c1ab0ed9cf5640222917b29d19dae5da32e8c3a64bf079caafd"} Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.014002 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell04790-account-delete-g9286" podStartSLOduration=8.013981738 podStartE2EDuration="8.013981738s" podCreationTimestamp="2025-12-04 00:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:04:43.005988701 +0000 UTC m=+1418.767313122" watchObservedRunningTime="2025-12-04 00:04:43.013981738 +0000 UTC m=+1418.775306149" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.026199 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:43 crc kubenswrapper[4764]: E1204 00:04:43.029381 4764 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 04 00:04:43 crc kubenswrapper[4764]: E1204 00:04:43.029438 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data podName:76708e9b-1db4-42ca-94d2-7ff96d08d855 nodeName:}" failed. No retries permitted until 2025-12-04 00:04:51.029421298 +0000 UTC m=+1426.790745719 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data") pod "rabbitmq-server-0" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855") : configmap "rabbitmq-config-data" not found Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.029529 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron4ef6-account-delete-l854j" podStartSLOduration=8.02950531 podStartE2EDuration="8.02950531s" podCreationTimestamp="2025-12-04 00:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 00:04:43.022673362 +0000 UTC m=+1418.783997773" watchObservedRunningTime="2025-12-04 00:04:43.02950531 +0000 UTC m=+1418.790829721" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.056356 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.068193 4764 scope.go:117] "RemoveContainer" containerID="9063bab53ed33482cd89acc542fa22e753f09e7ceb951b53766f7dc55ea3fcda" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.077851 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f105a7d8-bb79-4578-98fd-aca60d5ffa10/ovn-northd/0.log" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.077950 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.089439 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.097073 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.100013 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.122783 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.128547 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.129844 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfzxz\" (UniqueName: \"kubernetes.io/projected/aa27bce9-febf-497d-ad48-21b087064f34-kube-api-access-wfzxz\") pod \"aa27bce9-febf-497d-ad48-21b087064f34\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.129885 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa27bce9-febf-497d-ad48-21b087064f34-operator-scripts\") pod \"aa27bce9-febf-497d-ad48-21b087064f34\" (UID: \"aa27bce9-febf-497d-ad48-21b087064f34\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.132372 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa27bce9-febf-497d-ad48-21b087064f34-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa27bce9-febf-497d-ad48-21b087064f34" (UID: "aa27bce9-febf-497d-ad48-21b087064f34"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.135073 4764 scope.go:117] "RemoveContainer" containerID="4d9413685b93b99db1f424005b4c8e8740303642c0922ed2d65d664c8191da2b" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.139372 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa27bce9-febf-497d-ad48-21b087064f34-kube-api-access-wfzxz" (OuterVolumeSpecName: "kube-api-access-wfzxz") pod "aa27bce9-febf-497d-ad48-21b087064f34" (UID: "aa27bce9-febf-497d-ad48-21b087064f34"). InnerVolumeSpecName "kube-api-access-wfzxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.144763 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.153159 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.165297 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.189650 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.201823 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.229754 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.231789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-rundir\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc 
kubenswrapper[4764]: I1204 00:04:43.231838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw9x2\" (UniqueName: \"kubernetes.io/projected/f105a7d8-bb79-4578-98fd-aca60d5ffa10-kube-api-access-pw9x2\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.232165 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-combined-ca-bundle\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.232621 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.232658 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psnvn\" (UniqueName: \"kubernetes.io/projected/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-kube-api-access-psnvn\") pod \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.232707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-kolla-config\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.232937 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-metrics-certs-tls-certs\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.232954 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data-custom\") pod \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-scripts\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233103 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-combined-ca-bundle\") pod \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-config\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233158 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data\") pod \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233192 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-operator-scripts\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233325 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-logs\") pod \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\" (UID: \"ee8deb66-8364-4d9c-bd17-e4ad937a35e2\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.233377 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-combined-ca-bundle\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 
00:04:43.236065 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl86m\" (UniqueName: \"kubernetes.io/projected/d8586564-9024-4375-a5f7-e75844abe723-kube-api-access-cl86m\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236109 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-config-data-default\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236134 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-northd-tls-certs\") pod \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\" (UID: \"f105a7d8-bb79-4578-98fd-aca60d5ffa10\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236153 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-galera-tls-certs\") pod \"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236172 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8586564-9024-4375-a5f7-e75844abe723-config-data-generated\") pod 
\"d8586564-9024-4375-a5f7-e75844abe723\" (UID: \"d8586564-9024-4375-a5f7-e75844abe723\") " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236657 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236670 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfzxz\" (UniqueName: \"kubernetes.io/projected/aa27bce9-febf-497d-ad48-21b087064f34-kube-api-access-wfzxz\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.236679 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa27bce9-febf-497d-ad48-21b087064f34-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.234012 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-config" (OuterVolumeSpecName: "config") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.234091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.234553 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-scripts" (OuterVolumeSpecName: "scripts") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.234981 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-logs" (OuterVolumeSpecName: "logs") pod "ee8deb66-8364-4d9c-bd17-e4ad937a35e2" (UID: "ee8deb66-8364-4d9c-bd17-e4ad937a35e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.235758 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.237734 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.241093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8586564-9024-4375-a5f7-e75844abe723-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.244485 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f105a7d8-bb79-4578-98fd-aca60d5ffa10-kube-api-access-pw9x2" (OuterVolumeSpecName: "kube-api-access-pw9x2") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "kube-api-access-pw9x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.255572 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-kube-api-access-psnvn" (OuterVolumeSpecName: "kube-api-access-psnvn") pod "ee8deb66-8364-4d9c-bd17-e4ad937a35e2" (UID: "ee8deb66-8364-4d9c-bd17-e4ad937a35e2"). InnerVolumeSpecName "kube-api-access-psnvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.255682 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee8deb66-8364-4d9c-bd17-e4ad937a35e2" (UID: "ee8deb66-8364-4d9c-bd17-e4ad937a35e2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.263392 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "mysql-db") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.264954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8586564-9024-4375-a5f7-e75844abe723-kube-api-access-cl86m" (OuterVolumeSpecName: "kube-api-access-cl86m") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "kube-api-access-cl86m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.265527 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-789dfd9c8d-k4z96"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.309227 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.321840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee8deb66-8364-4d9c-bd17-e4ad937a35e2" (UID: "ee8deb66-8364-4d9c-bd17-e4ad937a35e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.327685 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-789dfd9c8d-k4z96"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.333412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.336054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d8586564-9024-4375-a5f7-e75844abe723" (UID: "d8586564-9024-4375-a5f7-e75844abe723"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340298 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d8586564-9024-4375-a5f7-e75844abe723-config-data-generated\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340358 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw9x2\" (UniqueName: \"kubernetes.io/projected/f105a7d8-bb79-4578-98fd-aca60d5ffa10-kube-api-access-pw9x2\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340374 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340393 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psnvn\" (UniqueName: \"kubernetes.io/projected/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-kube-api-access-psnvn\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340427 4764 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-kolla-config\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340443 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340457 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340477 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340513 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f105a7d8-bb79-4578-98fd-aca60d5ffa10-config\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340527 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340549 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-logs\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340561 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340614 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl86m\" (UniqueName: \"kubernetes.io/projected/d8586564-9024-4375-a5f7-e75844abe723-kube-api-access-cl86m\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340647 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340684 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d8586564-9024-4375-a5f7-e75844abe723-config-data-default\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.340695 4764 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8586564-9024-4375-a5f7-e75844abe723-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.348349 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="74af5cde-29d3-4ff7-803b-fb335fc8209c" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.194:6080/vnc_lite.html\": dial tcp 10.217.0.194:6080: i/o timeout"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.354149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data" (OuterVolumeSpecName: "config-data") pod "ee8deb66-8364-4d9c-bd17-e4ad937a35e2" (UID: "ee8deb66-8364-4d9c-bd17-e4ad937a35e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.362797 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder2a5d-account-delete-tbzqq"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.370084 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.372454 4764 scope.go:117] "RemoveContainer" containerID="c0f14891d1b59f0d4bb85f831e2a4b7f44911359e51183f14fe60719afd8d989"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.378802 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.390111 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "f105a7d8-bb79-4578-98fd-aca60d5ffa10" (UID: "f105a7d8-bb79-4578-98fd-aca60d5ffa10"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.396083 4764 scope.go:117] "RemoveContainer" containerID="f79070c4af81fb3ba806ca5d2c61d64116e1765e388c3c78534b5f9ef1cd7663"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.396998 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder2a5d-account-delete-tbzqq"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.404287 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.411609 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.419257 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6d45ff9d86-725zf"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.425640 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-754f454454-nb48r"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.438760 4764 scope.go:117] "RemoveContainer" containerID="9d80510315903831925f0f07ece37532b57c67fdb182877e40648360e6331fe9"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.439831 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6d45ff9d86-725zf"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.442693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-scripts\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.442768 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-combined-ca-bundle\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.442830 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-fernet-keys\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.442870 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-credential-keys\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.442906 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhxmv\" (UniqueName: \"kubernetes.io/projected/803d2331-67a9-462d-9e22-09a112264732-kube-api-access-vhxmv\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.442990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-public-tls-certs\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.443009 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-internal-tls-certs\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.443068 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-config-data\") pod \"803d2331-67a9-462d-9e22-09a112264732\" (UID: \"803d2331-67a9-462d-9e22-09a112264732\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.443387 4764 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.443405 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee8deb66-8364-4d9c-bd17-e4ad937a35e2-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.443414 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.443426 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f105a7d8-bb79-4578-98fd-aca60d5ffa10-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.445799 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-scripts" (OuterVolumeSpecName: "scripts") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.455142 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803d2331-67a9-462d-9e22-09a112264732-kube-api-access-vhxmv" (OuterVolumeSpecName: "kube-api-access-vhxmv") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "kube-api-access-vhxmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.457275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.462076 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.468236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.470934 4764 scope.go:117] "RemoveContainer" containerID="6ef4634d4e9a70890a62dc5bc5ec2d0dea18b5551be672ee6677a592a96cead8"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.474346 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.476098 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.476555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-config-data" (OuterVolumeSpecName: "config-data") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.493991 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.497573 4764 scope.go:117] "RemoveContainer" containerID="4dec83b8647040009bb8b20db48c59cfdae71ee1a3fa1d5ef147201319666a80"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.499155 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.499988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "803d2331-67a9-462d-9e22-09a112264732" (UID: "803d2331-67a9-462d-9e22-09a112264732"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.500644 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.514142 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.517216 4764 scope.go:117] "RemoveContainer" containerID="0f1e5405b57025512e61585a9e9a3c74dacc900d7181ee5cacc158e3f86552fc"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.524137 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.532560 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-54ddd476ff-9v8dj"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.542400 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-54ddd476ff-9v8dj"]
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544905 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544931 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544942 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544956 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544968 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544978 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544988 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/803d2331-67a9-462d-9e22-09a112264732-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.544998 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhxmv\" (UniqueName: \"kubernetes.io/projected/803d2331-67a9-462d-9e22-09a112264732-kube-api-access-vhxmv\") on node \"crc\" DevicePath \"\""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.555405 4764 scope.go:117] "RemoveContainer" containerID="5523cc0c69a6274103cea6cdb99c2b0cb069c2b4434f1a09627b39395825d92d"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.577896 4764 scope.go:117] "RemoveContainer" containerID="bfa789db8eedd550d660743c753b3bea2fab2bce89eb7947314062414fa5026a"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.601726 4764 scope.go:117] "RemoveContainer" containerID="32bfd6d3548d7ed75468d396b935e173001398b11d776ac46fbc7e65ee2ad928"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.634478 4764 scope.go:117] "RemoveContainer" containerID="ac966fb4e19027f88b3b69616fbd7358921c916973249c52a6951d8a77e62d9f"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.658784 4764 scope.go:117] 
"RemoveContainer" containerID="05700e796c28abd13f4c5635747b2b007e49376c6c97684f76cd88c2347348c6"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.676194 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.686662 4764 scope.go:117] "RemoveContainer" containerID="da01bc5b68c5d213728724e481b6998adc831fb8b0fab47de60acb606240ad85"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.712160 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747364 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-erlang-cookie\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747422 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-plugins-conf\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzvl4\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-kube-api-access-jzvl4\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747483 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldxds\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-kube-api-access-ldxds\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747512 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-erlang-cookie\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747577 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747606 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76708e9b-1db4-42ca-94d2-7ff96d08d855-erlang-cookie-secret\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76708e9b-1db4-42ca-94d2-7ff96d08d855-pod-info\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747675 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-confd\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747708 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747755 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-plugins-conf\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-server-conf\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-tls\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747813 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-erlang-cookie-secret\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747835 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-server-conf\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-pod-info\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747880 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-tls\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-confd\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747928 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-plugins\") pod \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\" (UID: \"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.747956 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-plugins\") pod \"76708e9b-1db4-42ca-94d2-7ff96d08d855\" (UID: \"76708e9b-1db4-42ca-94d2-7ff96d08d855\") "
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.748616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.749044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.749471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.749491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.750082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.751470 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-kube-api-access-ldxds" (OuterVolumeSpecName: "kube-api-access-ldxds") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "kube-api-access-ldxds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.751769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.753258 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.753359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76708e9b-1db4-42ca-94d2-7ff96d08d855-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.755827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.756140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.760558 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-kube-api-access-jzvl4" (OuterVolumeSpecName: "kube-api-access-jzvl4") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "kube-api-access-jzvl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.761682 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.766932 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/76708e9b-1db4-42ca-94d2-7ff96d08d855-pod-info" (OuterVolumeSpecName: "pod-info") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.781954 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.783397 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-pod-info" (OuterVolumeSpecName: "pod-info") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.798528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data" (OuterVolumeSpecName: "config-data") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.806153 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data" (OuterVolumeSpecName: "config-data") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.817140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-server-conf" (OuterVolumeSpecName: "server-conf") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850123 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850154 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850167 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850200 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850402 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850415 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850433 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzvl4\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-kube-api-access-jzvl4\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 
00:04:43.850444 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldxds\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-kube-api-access-ldxds\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850477 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850490 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850501 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850512 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/76708e9b-1db4-42ca-94d2-7ff96d08d855-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850523 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850534 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/76708e9b-1db4-42ca-94d2-7ff96d08d855-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850553 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850564 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850573 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/76708e9b-1db4-42ca-94d2-7ff96d08d855-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850583 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.850593 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.875342 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.904631 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.904963 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "76708e9b-1db4-42ca-94d2-7ff96d08d855" (UID: "76708e9b-1db4-42ca-94d2-7ff96d08d855"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.914351 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-server-conf" (OuterVolumeSpecName: "server-conf") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.923404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" (UID: "bda43f61-31ae-4c4c-967e-f0e8d13f5ae9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.952198 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.952239 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/76708e9b-1db4-42ca-94d2-7ff96d08d855-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.952252 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.952262 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9-rabbitmq-confd\") on node \"crc\" 
DevicePath \"\"" Dec 04 00:04:43 crc kubenswrapper[4764]: I1204 00:04:43.952271 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.017591 4764 generic.go:334] "Generic (PLEG): container finished" podID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerID="0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf" exitCode=0 Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.017644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9","Type":"ContainerDied","Data":"0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf"} Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.017668 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda43f61-31ae-4c4c-967e-f0e8d13f5ae9","Type":"ContainerDied","Data":"2fb8fe5fe4b5dd4640b6f9d111a943406eeb12c609f69dcc9c26a2e78dd0f3f3"} Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.018131 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.019472 4764 scope.go:117] "RemoveContainer" containerID="0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.022443 4764 generic.go:334] "Generic (PLEG): container finished" podID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerID="699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972" exitCode=0 Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.022541 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.023082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"76708e9b-1db4-42ca-94d2-7ff96d08d855","Type":"ContainerDied","Data":"699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972"} Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.023116 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"76708e9b-1db4-42ca-94d2-7ff96d08d855","Type":"ContainerDied","Data":"79b05697b0109953e848aa512e8920f7fd0c8e0fec85799274334d68e47f1f37"} Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.043654 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5869cb876-lfmmz" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.044028 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-754f454454-nb48r" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.046683 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.049969 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.050828 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance6ca1-account-delete-t8t45" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.052087 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-754f454454-nb48r" event={"ID":"803d2331-67a9-462d-9e22-09a112264732","Type":"ContainerDied","Data":"80b91bd99d8534b3de266d2b1715863f161532c53aa2dac7499707c78064a3ef"} Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.067306 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.043217 4764 scope.go:117] "RemoveContainer" containerID="6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.078191 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.103853 4764 scope.go:117] "RemoveContainer" containerID="0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf" Dec 04 00:04:44 crc kubenswrapper[4764]: E1204 00:04:44.104684 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf\": container with ID starting with 0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf not found: ID does not exist" containerID="0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.104854 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf"} err="failed to get container status \"0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf\": rpc error: code = NotFound desc = could not find container \"0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf\": container with ID 
starting with 0f89a7ff838ef02b938c5ec8480fac7046de7baf7b243f18713ab2d5a8c0c7bf not found: ID does not exist" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.104954 4764 scope.go:117] "RemoveContainer" containerID="6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11" Dec 04 00:04:44 crc kubenswrapper[4764]: E1204 00:04:44.108054 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11\": container with ID starting with 6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11 not found: ID does not exist" containerID="6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.120665 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11"} err="failed to get container status \"6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11\": rpc error: code = NotFound desc = could not find container \"6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11\": container with ID starting with 6aba4fe590b2568bd12e25baeaaafb33abd68f7cb76cc9e8c5cc89dfe118ac11 not found: ID does not exist" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.120724 4764 scope.go:117] "RemoveContainer" containerID="699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.120882 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.129533 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.138642 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-keystone-listener-5869cb876-lfmmz"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.165313 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5869cb876-lfmmz"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.184833 4764 scope.go:117] "RemoveContainer" containerID="309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.186760 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-754f454454-nb48r"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.194743 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-754f454454-nb48r"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.201472 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance6ca1-account-delete-t8t45"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.209610 4764 scope.go:117] "RemoveContainer" containerID="699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972" Dec 04 00:04:44 crc kubenswrapper[4764]: E1204 00:04:44.210094 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972\": container with ID starting with 699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972 not found: ID does not exist" containerID="699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.210146 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972"} err="failed to get container status \"699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972\": rpc error: code = NotFound desc = could not find container 
\"699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972\": container with ID starting with 699327ce2e04deff64b7b5f6bcb0af8bfac7aa555a8d6374f45df2f7408e6972 not found: ID does not exist" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.210180 4764 scope.go:117] "RemoveContainer" containerID="309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0" Dec 04 00:04:44 crc kubenswrapper[4764]: E1204 00:04:44.210671 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0\": container with ID starting with 309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0 not found: ID does not exist" containerID="309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.210710 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0"} err="failed to get container status \"309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0\": rpc error: code = NotFound desc = could not find container \"309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0\": container with ID starting with 309e1e3927437beccfecc7130de865a7282f9b153a7ad00b85d39d985e283fc0 not found: ID does not exist" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.210792 4764 scope.go:117] "RemoveContainer" containerID="5ba3f5a666e85c1ab0ed9cf5640222917b29d19dae5da32e8c3a64bf079caafd" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.215489 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance6ca1-account-delete-t8t45"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.221770 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.244944 4764 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.257991 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.265680 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 00:04:44 crc kubenswrapper[4764]: E1204 00:04:44.476340 4764 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 00:04:44 crc kubenswrapper[4764]: E1204 00:04:44.476425 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts podName:bd7a5353-be52-43e9-9490-530240b943fe nodeName:}" failed. No retries permitted until 2025-12-04 00:04:48.476406415 +0000 UTC m=+1424.237730846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts") pod "placement1158-account-delete-dqsj8" (UID: "bd7a5353-be52-43e9-9490-530240b943fe") : configmap "openstack-scripts" not found Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.560126 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047e426c-4178-43ce-8a09-ff5b4a6a13f1" path="/var/lib/kubelet/pods/047e426c-4178-43ce-8a09-ff5b4a6a13f1/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.560693 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a7dd687-d272-4102-bc70-199b44353a21" path="/var/lib/kubelet/pods/3a7dd687-d272-4102-bc70-199b44353a21/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.561305 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" path="/var/lib/kubelet/pods/3cad4f7f-7546-406c-822b-b6f77365d830/volumes" Dec 04 00:04:44 
crc kubenswrapper[4764]: I1204 00:04:44.562265 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" path="/var/lib/kubelet/pods/58eedbd8-7bbd-444f-bd11-784c5e7429fa/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.562818 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="662de035-d0f1-4a65-98ad-161d6f21bd26" path="/var/lib/kubelet/pods/662de035-d0f1-4a65-98ad-161d6f21bd26/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.563307 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7323df53-27cc-46a0-ad81-1e916db379af" path="/var/lib/kubelet/pods/7323df53-27cc-46a0-ad81-1e916db379af/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.564599 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" path="/var/lib/kubelet/pods/76708e9b-1db4-42ca-94d2-7ff96d08d855/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.565222 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" path="/var/lib/kubelet/pods/7ef3ecde-294a-410a-ba90-d08a00674b9f/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.566529 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803d2331-67a9-462d-9e22-09a112264732" path="/var/lib/kubelet/pods/803d2331-67a9-462d-9e22-09a112264732/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.566997 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" path="/var/lib/kubelet/pods/80b73a08-1afd-4f2a-b565-fd8f2b06c0b6/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.567539 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa27bce9-febf-497d-ad48-21b087064f34" path="/var/lib/kubelet/pods/aa27bce9-febf-497d-ad48-21b087064f34/volumes" Dec 04 00:04:44 
crc kubenswrapper[4764]: I1204 00:04:44.568467 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" path="/var/lib/kubelet/pods/ae2d9b02-4247-444e-ba56-05d65493dd3e/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.569254 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" path="/var/lib/kubelet/pods/bda43f61-31ae-4c4c-967e-f0e8d13f5ae9/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.569934 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8586564-9024-4375-a5f7-e75844abe723" path="/var/lib/kubelet/pods/d8586564-9024-4375-a5f7-e75844abe723/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.577854 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" path="/var/lib/kubelet/pods/e6de1323-46ca-460b-8a8f-620125ce1d7f/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.578992 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" path="/var/lib/kubelet/pods/e9218a48-75e0-47ae-a2ac-2d2fa4d08971/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.580413 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" path="/var/lib/kubelet/pods/ec3e74e4-e0bc-45a3-a568-c70087b73572/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.581295 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" path="/var/lib/kubelet/pods/ee8deb66-8364-4d9c-bd17-e4ad937a35e2/volumes" Dec 04 00:04:44 crc kubenswrapper[4764]: I1204 00:04:44.582093 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7e0298-05be-4a37-a1d3-44632ea1d770" path="/var/lib/kubelet/pods/ef7e0298-05be-4a37-a1d3-44632ea1d770/volumes" Dec 04 00:04:44 
crc kubenswrapper[4764]: I1204 00:04:44.583545 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" path="/var/lib/kubelet/pods/f105a7d8-bb79-4578-98fd-aca60d5ffa10/volumes" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.852083 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.898767 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-sg-core-conf-yaml\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.898819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-log-httpd\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.898864 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-ceilometer-tls-certs\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.898891 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-config-data\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.898955 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m57x\" 
(UniqueName: \"kubernetes.io/projected/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-kube-api-access-6m57x\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.898998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-combined-ca-bundle\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.899050 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-scripts\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.899099 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-run-httpd\") pod \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\" (UID: \"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6\") " Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.899861 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.902113 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.903302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-kube-api-access-6m57x" (OuterVolumeSpecName: "kube-api-access-6m57x") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "kube-api-access-6m57x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.917075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-scripts" (OuterVolumeSpecName: "scripts") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.923458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.954096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.977866 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:45 crc kubenswrapper[4764]: I1204 00:04:45.992027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-config-data" (OuterVolumeSpecName: "config-data") pod "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" (UID: "5514d84f-88e9-4b13-9b5d-1c17c15fc0b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001861 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001914 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001926 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001939 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001950 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m57x\" (UniqueName: \"kubernetes.io/projected/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-kube-api-access-6m57x\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001962 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001972 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.001984 4764 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.067380 4764 generic.go:334] "Generic (PLEG): container finished" podID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerID="6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db" exitCode=0 Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.067430 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerDied","Data":"6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db"} Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.067456 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.067465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5514d84f-88e9-4b13-9b5d-1c17c15fc0b6","Type":"ContainerDied","Data":"77d435b07ebd0f55f4c323c1297c8886838f47b04c073112c54fd6cb080e76cb"} Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.067487 4764 scope.go:117] "RemoveContainer" containerID="76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.099816 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.102586 4764 scope.go:117] "RemoveContainer" containerID="52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.105021 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.132361 4764 scope.go:117] "RemoveContainer" 
containerID="6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.151270 4764 scope.go:117] "RemoveContainer" containerID="581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.186461 4764 scope.go:117] "RemoveContainer" containerID="76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7" Dec 04 00:04:46 crc kubenswrapper[4764]: E1204 00:04:46.187448 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7\": container with ID starting with 76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7 not found: ID does not exist" containerID="76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.187483 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7"} err="failed to get container status \"76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7\": rpc error: code = NotFound desc = could not find container \"76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7\": container with ID starting with 76a528472193373fb0f507bb5ac10049475267e244a6b83e5f17c0325cccb6d7 not found: ID does not exist" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.187508 4764 scope.go:117] "RemoveContainer" containerID="52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81" Dec 04 00:04:46 crc kubenswrapper[4764]: E1204 00:04:46.187952 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81\": container with ID starting with 
52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81 not found: ID does not exist" containerID="52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.187983 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81"} err="failed to get container status \"52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81\": rpc error: code = NotFound desc = could not find container \"52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81\": container with ID starting with 52128c7014471be1b90db9431d72ddfa069ff976adb002f72a589f6e00f10c81 not found: ID does not exist" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.188008 4764 scope.go:117] "RemoveContainer" containerID="6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db" Dec 04 00:04:46 crc kubenswrapper[4764]: E1204 00:04:46.188213 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db\": container with ID starting with 6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db not found: ID does not exist" containerID="6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.188238 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db"} err="failed to get container status \"6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db\": rpc error: code = NotFound desc = could not find container \"6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db\": container with ID starting with 6dbb9935b2a76a679bc7a454459f9e329ac70be1293653a363524d1bf3ab71db not found: ID does not 
exist" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.188256 4764 scope.go:117] "RemoveContainer" containerID="581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93" Dec 04 00:04:46 crc kubenswrapper[4764]: E1204 00:04:46.188529 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93\": container with ID starting with 581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93 not found: ID does not exist" containerID="581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.188554 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93"} err="failed to get container status \"581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93\": rpc error: code = NotFound desc = could not find container \"581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93\": container with ID starting with 581c5358df957df7bfa448150600de3d4d5d3cd728d9699c552f33fef1094f93 not found: ID does not exist" Dec 04 00:04:46 crc kubenswrapper[4764]: I1204 00:04:46.558778 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" path="/var/lib/kubelet/pods/5514d84f-88e9-4b13-9b5d-1c17c15fc0b6/volumes" Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.261282 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:47 crc 
kubenswrapper[4764]: E1204 00:04:47.262481 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.263029 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.263028 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.263099 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.264223 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.265832 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:47 crc kubenswrapper[4764]: E1204 00:04:47.265868 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:04:48 crc kubenswrapper[4764]: E1204 00:04:48.547290 4764 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 00:04:48 crc kubenswrapper[4764]: E1204 00:04:48.547380 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts podName:bd7a5353-be52-43e9-9490-530240b943fe nodeName:}" failed. No retries permitted until 2025-12-04 00:04:56.547362342 +0000 UTC m=+1432.308686753 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts") pod "placement1158-account-delete-dqsj8" (UID: "bd7a5353-be52-43e9-9490-530240b943fe") : configmap "openstack-scripts" not found Dec 04 00:04:50 crc kubenswrapper[4764]: I1204 00:04:50.869303 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:04:50 crc kubenswrapper[4764]: I1204 00:04:50.869695 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.261106 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.261746 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:52 
crc kubenswrapper[4764]: E1204 00:04:52.262377 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.262445 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.262991 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.266834 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.270566 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:52 crc kubenswrapper[4764]: E1204 00:04:52.270619 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:04:56 crc kubenswrapper[4764]: E1204 00:04:56.582154 4764 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 04 00:04:56 crc kubenswrapper[4764]: E1204 00:04:56.582252 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts podName:bd7a5353-be52-43e9-9490-530240b943fe nodeName:}" failed. No retries permitted until 2025-12-04 00:05:12.582230989 +0000 UTC m=+1448.343555440 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts") pod "placement1158-account-delete-dqsj8" (UID: "bd7a5353-be52-43e9-9490-530240b943fe") : configmap "openstack-scripts" not found Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.260896 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.261459 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.261877 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.261909 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: 
container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.262159 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.266185 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.268487 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:04:57 crc kubenswrapper[4764]: E1204 00:04:57.268540 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.790875 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.852306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-ovndb-tls-certs\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.852429 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-internal-tls-certs\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.853580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnz5c\" (UniqueName: \"kubernetes.io/projected/8499c909-53fe-4742-aa11-29e214451689-kube-api-access-cnz5c\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.854463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-public-tls-certs\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.854536 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-config\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.854574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-httpd-config\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.854626 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-combined-ca-bundle\") pod \"8499c909-53fe-4742-aa11-29e214451689\" (UID: \"8499c909-53fe-4742-aa11-29e214451689\") " Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.857607 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8499c909-53fe-4742-aa11-29e214451689-kube-api-access-cnz5c" (OuterVolumeSpecName: "kube-api-access-cnz5c") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "kube-api-access-cnz5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.857815 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.902076 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-config" (OuterVolumeSpecName: "config") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.908666 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.912047 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.918536 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.932586 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8499c909-53fe-4742-aa11-29e214451689" (UID: "8499c909-53fe-4742-aa11-29e214451689"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956558 4764 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956591 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnz5c\" (UniqueName: \"kubernetes.io/projected/8499c909-53fe-4742-aa11-29e214451689-kube-api-access-cnz5c\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956602 4764 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956612 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956621 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956629 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:58 crc kubenswrapper[4764]: I1204 00:04:58.956637 4764 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8499c909-53fe-4742-aa11-29e214451689-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.214487 4764 
generic.go:334] "Generic (PLEG): container finished" podID="8499c909-53fe-4742-aa11-29e214451689" containerID="f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3" exitCode=0 Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.214600 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c96d99869-mwjrh" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.214601 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c96d99869-mwjrh" event={"ID":"8499c909-53fe-4742-aa11-29e214451689","Type":"ContainerDied","Data":"f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3"} Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.215194 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c96d99869-mwjrh" event={"ID":"8499c909-53fe-4742-aa11-29e214451689","Type":"ContainerDied","Data":"82cae38fafe86fe46c4d0042fcfce2c20fa3928de79e4db413edba7a96a701a9"} Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.215246 4764 scope.go:117] "RemoveContainer" containerID="a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.248030 4764 scope.go:117] "RemoveContainer" containerID="f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.266852 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c96d99869-mwjrh"] Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.276046 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c96d99869-mwjrh"] Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.282379 4764 scope.go:117] "RemoveContainer" containerID="a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c" Dec 04 00:04:59 crc kubenswrapper[4764]: E1204 00:04:59.282877 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c\": container with ID starting with a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c not found: ID does not exist" containerID="a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.282916 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c"} err="failed to get container status \"a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c\": rpc error: code = NotFound desc = could not find container \"a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c\": container with ID starting with a7a6a7ab85896082aaeb8f6a681402b4d89863386c4458eabef7ff5c6aae520c not found: ID does not exist" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.282947 4764 scope.go:117] "RemoveContainer" containerID="f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3" Dec 04 00:04:59 crc kubenswrapper[4764]: E1204 00:04:59.283376 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3\": container with ID starting with f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3 not found: ID does not exist" containerID="f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3" Dec 04 00:04:59 crc kubenswrapper[4764]: I1204 00:04:59.283401 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3"} err="failed to get container status \"f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3\": rpc error: code = NotFound desc = could not find container 
\"f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3\": container with ID starting with f0b64eca80d22208d290b42b72195a8ff6de21063e2e9c96091e5437fe2184e3 not found: ID does not exist" Dec 04 00:05:00 crc kubenswrapper[4764]: I1204 00:05:00.562499 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8499c909-53fe-4742-aa11-29e214451689" path="/var/lib/kubelet/pods/8499c909-53fe-4742-aa11-29e214451689/volumes" Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.262313 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.264107 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.264134 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.264687 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.264800 4764 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.265912 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.268791 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 04 00:05:02 crc kubenswrapper[4764]: E1204 00:05:02.268857 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-xnsqq" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.311661 4764 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xnsqq_9f850034-7f6e-4811-b98f-89648c559dcd/ovs-vswitchd/0.log" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.311758 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-xnsqq_9f850034-7f6e-4811-b98f-89648c559dcd/ovs-vswitchd/0.log" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.313459 4764 generic.go:334] "Generic (PLEG): container finished" podID="9f850034-7f6e-4811-b98f-89648c559dcd" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" exitCode=137 Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.313491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xnsqq" event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerDied","Data":"2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1"} Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.313510 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-xnsqq" event={"ID":"9f850034-7f6e-4811-b98f-89648c559dcd","Type":"ContainerDied","Data":"de92fdc23ff22f26de018d9ed7d3f9f3c88305ce541b66ece2c46ec79fd069f5"} Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.313520 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de92fdc23ff22f26de018d9ed7d3f9f3c88305ce541b66ece2c46ec79fd069f5" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.313628 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-xnsqq" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.377911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f850034-7f6e-4811-b98f-89648c559dcd-scripts\") pod \"9f850034-7f6e-4811-b98f-89648c559dcd\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378083 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-run\") pod \"9f850034-7f6e-4811-b98f-89648c559dcd\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378106 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-log\") pod \"9f850034-7f6e-4811-b98f-89648c559dcd\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378135 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-run" (OuterVolumeSpecName: "var-run") pod "9f850034-7f6e-4811-b98f-89648c559dcd" (UID: "9f850034-7f6e-4811-b98f-89648c559dcd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378163 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "9f850034-7f6e-4811-b98f-89648c559dcd" (UID: "9f850034-7f6e-4811-b98f-89648c559dcd"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-etc-ovs\") pod \"9f850034-7f6e-4811-b98f-89648c559dcd\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378184 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-log" (OuterVolumeSpecName: "var-log") pod "9f850034-7f6e-4811-b98f-89648c559dcd" (UID: "9f850034-7f6e-4811-b98f-89648c559dcd"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378231 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7tbw\" (UniqueName: \"kubernetes.io/projected/9f850034-7f6e-4811-b98f-89648c559dcd-kube-api-access-q7tbw\") pod \"9f850034-7f6e-4811-b98f-89648c559dcd\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378248 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-lib\") pod \"9f850034-7f6e-4811-b98f-89648c559dcd\" (UID: \"9f850034-7f6e-4811-b98f-89648c559dcd\") " Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-lib" (OuterVolumeSpecName: "var-lib") pod "9f850034-7f6e-4811-b98f-89648c559dcd" (UID: "9f850034-7f6e-4811-b98f-89648c559dcd"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378540 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378558 4764 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378569 4764 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.378579 4764 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9f850034-7f6e-4811-b98f-89648c559dcd-var-lib\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.379558 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f850034-7f6e-4811-b98f-89648c559dcd-scripts" (OuterVolumeSpecName: "scripts") pod "9f850034-7f6e-4811-b98f-89648c559dcd" (UID: "9f850034-7f6e-4811-b98f-89648c559dcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.392896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f850034-7f6e-4811-b98f-89648c559dcd-kube-api-access-q7tbw" (OuterVolumeSpecName: "kube-api-access-q7tbw") pod "9f850034-7f6e-4811-b98f-89648c559dcd" (UID: "9f850034-7f6e-4811-b98f-89648c559dcd"). InnerVolumeSpecName "kube-api-access-q7tbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.479517 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7tbw\" (UniqueName: \"kubernetes.io/projected/9f850034-7f6e-4811-b98f-89648c559dcd-kube-api-access-q7tbw\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.479540 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f850034-7f6e-4811-b98f-89648c559dcd-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:06 crc kubenswrapper[4764]: I1204 00:05:06.977793 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.087534 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-lock\") pod \"1691fb5b-c57a-4773-9710-347c99bd9712\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.087959 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1691fb5b-c57a-4773-9710-347c99bd9712\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.088038 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") pod \"1691fb5b-c57a-4773-9710-347c99bd9712\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.088071 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w2hg\" (UniqueName: 
\"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-kube-api-access-9w2hg\") pod \"1691fb5b-c57a-4773-9710-347c99bd9712\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.088103 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-cache\") pod \"1691fb5b-c57a-4773-9710-347c99bd9712\" (UID: \"1691fb5b-c57a-4773-9710-347c99bd9712\") " Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.088196 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-lock" (OuterVolumeSpecName: "lock") pod "1691fb5b-c57a-4773-9710-347c99bd9712" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.088629 4764 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-lock\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.088970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-cache" (OuterVolumeSpecName: "cache") pod "1691fb5b-c57a-4773-9710-347c99bd9712" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.090985 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-kube-api-access-9w2hg" (OuterVolumeSpecName: "kube-api-access-9w2hg") pod "1691fb5b-c57a-4773-9710-347c99bd9712" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712"). 
InnerVolumeSpecName "kube-api-access-9w2hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.091217 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1691fb5b-c57a-4773-9710-347c99bd9712" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.091568 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "swift") pod "1691fb5b-c57a-4773-9710-347c99bd9712" (UID: "1691fb5b-c57a-4773-9710-347c99bd9712"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.190069 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.190112 4764 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.190125 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w2hg\" (UniqueName: \"kubernetes.io/projected/1691fb5b-c57a-4773-9710-347c99bd9712-kube-api-access-9w2hg\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.190135 4764 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1691fb5b-c57a-4773-9710-347c99bd9712-cache\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:07 
crc kubenswrapper[4764]: I1204 00:05:07.216702 4764 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.292043 4764 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.328927 4764 generic.go:334] "Generic (PLEG): container finished" podID="1691fb5b-c57a-4773-9710-347c99bd9712" containerID="65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415" exitCode=137 Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.328992 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415"} Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.329033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1691fb5b-c57a-4773-9710-347c99bd9712","Type":"ContainerDied","Data":"62535ef31ad81bf101df72585b98d5c55b463ae8889b980ee0b9e4ef0c4b917e"} Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.329088 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-xnsqq" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.329104 4764 scope.go:117] "RemoveContainer" containerID="65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.329191 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.356452 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-xnsqq"] Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.362012 4764 scope.go:117] "RemoveContainer" containerID="a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.365819 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-xnsqq"] Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.390079 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.393483 4764 scope.go:117] "RemoveContainer" containerID="79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.400227 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.408599 4764 scope.go:117] "RemoveContainer" containerID="bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.422787 4764 scope.go:117] "RemoveContainer" containerID="b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.445229 4764 scope.go:117] "RemoveContainer" containerID="3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.461953 4764 scope.go:117] "RemoveContainer" containerID="db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.482336 4764 scope.go:117] "RemoveContainer" containerID="6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.501293 4764 
scope.go:117] "RemoveContainer" containerID="5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.522000 4764 scope.go:117] "RemoveContainer" containerID="9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.552983 4764 scope.go:117] "RemoveContainer" containerID="8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.579057 4764 scope.go:117] "RemoveContainer" containerID="49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.601211 4764 scope.go:117] "RemoveContainer" containerID="bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.625843 4764 scope.go:117] "RemoveContainer" containerID="ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.647194 4764 scope.go:117] "RemoveContainer" containerID="229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.669553 4764 scope.go:117] "RemoveContainer" containerID="65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.670126 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415\": container with ID starting with 65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415 not found: ID does not exist" containerID="65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.670212 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415"} err="failed to get container status \"65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415\": rpc error: code = NotFound desc = could not find container \"65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415\": container with ID starting with 65a64f5d9f9bacd2d1dd9434579eb89e30e1bc3c6dbfc10cf864c6398db18415 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.670267 4764 scope.go:117] "RemoveContainer" containerID="a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.670781 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a\": container with ID starting with a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a not found: ID does not exist" containerID="a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.670816 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a"} err="failed to get container status \"a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a\": rpc error: code = NotFound desc = could not find container \"a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a\": container with ID starting with a3705cb87bbbd7b58371aa155d796fa57616161aa4bcbdc1687838a1a4a9982a not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.670844 4764 scope.go:117] "RemoveContainer" containerID="79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.671259 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52\": container with ID starting with 79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52 not found: ID does not exist" containerID="79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.671332 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52"} err="failed to get container status \"79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52\": rpc error: code = NotFound desc = could not find container \"79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52\": container with ID starting with 79d487080d56119fa9d863ee60da56fc479a3ad9b0da2c40e96dd64e69eafd52 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.671372 4764 scope.go:117] "RemoveContainer" containerID="bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.671757 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372\": container with ID starting with bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372 not found: ID does not exist" containerID="bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.671792 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372"} err="failed to get container status \"bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372\": rpc error: code = NotFound desc = could not find container 
\"bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372\": container with ID starting with bb4b732729b69ba5767d89877474e9d03e7762b0a868b771c768590f17771372 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.671811 4764 scope.go:117] "RemoveContainer" containerID="b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.672164 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a\": container with ID starting with b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a not found: ID does not exist" containerID="b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.672227 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a"} err="failed to get container status \"b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a\": rpc error: code = NotFound desc = could not find container \"b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a\": container with ID starting with b4681ca0d6b5e09c77906c446be1f7591cdd1c73bd87d192191df005e8e1e52a not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.672267 4764 scope.go:117] "RemoveContainer" containerID="3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.672606 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566\": container with ID starting with 3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566 not found: ID does not exist" 
containerID="3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.672638 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566"} err="failed to get container status \"3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566\": rpc error: code = NotFound desc = could not find container \"3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566\": container with ID starting with 3b30ae385e319b94dfd7af3bd92cdabcfd3a37937783200788bc76482d1ac566 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.672658 4764 scope.go:117] "RemoveContainer" containerID="db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.673030 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9\": container with ID starting with db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9 not found: ID does not exist" containerID="db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.673059 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9"} err="failed to get container status \"db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9\": rpc error: code = NotFound desc = could not find container \"db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9\": container with ID starting with db132801c03841be2751ea72c5796ead30192661aa76bba2a713c24dc31d1dc9 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.673075 4764 scope.go:117] 
"RemoveContainer" containerID="6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.673697 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6\": container with ID starting with 6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6 not found: ID does not exist" containerID="6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.673758 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6"} err="failed to get container status \"6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6\": rpc error: code = NotFound desc = could not find container \"6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6\": container with ID starting with 6b0529bbe68ca4a60aa30c664700c1f57728513ad1cdb5686baeb0acfc1aa2b6 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.673776 4764 scope.go:117] "RemoveContainer" containerID="5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.674176 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf\": container with ID starting with 5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf not found: ID does not exist" containerID="5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.674202 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf"} err="failed to get container status \"5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf\": rpc error: code = NotFound desc = could not find container \"5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf\": container with ID starting with 5eda900178d17c37fd4ee20478d838637411874f73c2d9feadedecdd50e0bdaf not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.674218 4764 scope.go:117] "RemoveContainer" containerID="9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.674605 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe\": container with ID starting with 9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe not found: ID does not exist" containerID="9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.674631 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe"} err="failed to get container status \"9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe\": rpc error: code = NotFound desc = could not find container \"9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe\": container with ID starting with 9a68cafee2ce944f6cb804070c7f72d4a4daaab1a64edb69da978962aeb50ebe not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.674647 4764 scope.go:117] "RemoveContainer" containerID="8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.675316 4764 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8\": container with ID starting with 8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8 not found: ID does not exist" containerID="8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.675343 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8"} err="failed to get container status \"8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8\": rpc error: code = NotFound desc = could not find container \"8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8\": container with ID starting with 8ed2fb0f81018412637e850627841666310a63cd36b6bc3a4960e30ff376f4e8 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.675362 4764 scope.go:117] "RemoveContainer" containerID="49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.675874 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80\": container with ID starting with 49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80 not found: ID does not exist" containerID="49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.675899 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80"} err="failed to get container status \"49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80\": rpc error: code = NotFound desc = could not find container 
\"49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80\": container with ID starting with 49ffafefa8b390745b2658ba1e947830e40d506879b55f08d859b8fd9c686f80 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.675915 4764 scope.go:117] "RemoveContainer" containerID="bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.676342 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6\": container with ID starting with bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6 not found: ID does not exist" containerID="bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.676381 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6"} err="failed to get container status \"bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6\": rpc error: code = NotFound desc = could not find container \"bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6\": container with ID starting with bd1143e029c8a37204ba07d07f493135e74d825aa579f56afe7d8cb3a76c50e6 not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.676413 4764 scope.go:117] "RemoveContainer" containerID="ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.676815 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc\": container with ID starting with ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc not found: ID does not exist" 
containerID="ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.676882 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc"} err="failed to get container status \"ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc\": rpc error: code = NotFound desc = could not find container \"ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc\": container with ID starting with ce6a8cfae43b3553d29f91365107ef87f918d2103f1ec7719468f8ecfc98c3cc not found: ID does not exist" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.676912 4764 scope.go:117] "RemoveContainer" containerID="229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160" Dec 04 00:05:07 crc kubenswrapper[4764]: E1204 00:05:07.677404 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160\": container with ID starting with 229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160 not found: ID does not exist" containerID="229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160" Dec 04 00:05:07 crc kubenswrapper[4764]: I1204 00:05:07.677433 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160"} err="failed to get container status \"229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160\": rpc error: code = NotFound desc = could not find container \"229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160\": container with ID starting with 229731922be5f6c99ed01e9c00bcd031539cc6465ef08a782d43d01e3d031160 not found: ID does not exist" Dec 04 00:05:08 crc kubenswrapper[4764]: I1204 00:05:08.557098 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" path="/var/lib/kubelet/pods/1691fb5b-c57a-4773-9710-347c99bd9712/volumes" Dec 04 00:05:08 crc kubenswrapper[4764]: I1204 00:05:08.559035 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" path="/var/lib/kubelet/pods/9f850034-7f6e-4811-b98f-89648c559dcd/volumes" Dec 04 00:05:09 crc kubenswrapper[4764]: I1204 00:05:09.088363 4764 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8291acae-68d4-4e14-b0a7-40d026ff1cb2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8291acae-68d4-4e14-b0a7-40d026ff1cb2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8291acae_68d4_4e14_b0a7_40d026ff1cb2.slice" Dec 04 00:05:09 crc kubenswrapper[4764]: E1204 00:05:09.088432 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod8291acae-68d4-4e14-b0a7-40d026ff1cb2] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod8291acae-68d4-4e14-b0a7-40d026ff1cb2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8291acae_68d4_4e14_b0a7_40d026ff1cb2.slice" pod="openstack/openstackclient" podUID="8291acae-68d4-4e14-b0a7-40d026ff1cb2" Dec 04 00:05:09 crc kubenswrapper[4764]: I1204 00:05:09.352771 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.337444 4764 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod155f4570-7769-42ab-8bc0-168dba070531"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod155f4570-7769-42ab-8bc0-168dba070531] : Timed out while waiting for systemd to remove kubepods-besteffort-pod155f4570_7769_42ab_8bc0_168dba070531.slice" Dec 04 00:05:11 crc kubenswrapper[4764]: E1204 00:05:11.337921 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod155f4570-7769-42ab-8bc0-168dba070531] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod155f4570-7769-42ab-8bc0-168dba070531] : Timed out while waiting for systemd to remove kubepods-besteffort-pod155f4570_7769_42ab_8bc0_168dba070531.slice" pod="openstack/nova-cell1-conductor-0" podUID="155f4570-7769-42ab-8bc0-168dba070531" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.373118 4764 generic.go:334] "Generic (PLEG): container finished" podID="05b8c157-de2d-4811-a625-1a77c3c7b37b" containerID="74e119881855aa11e997be4d47415f2b08315ba62b5e9a6db9ebd6318b71cac9" exitCode=137 Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.373243 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.374058 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc616-account-delete-ls8lk" event={"ID":"05b8c157-de2d-4811-a625-1a77c3c7b37b","Type":"ContainerDied","Data":"74e119881855aa11e997be4d47415f2b08315ba62b5e9a6db9ebd6318b71cac9"} Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.435275 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.445100 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.469059 4764 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod62d2ebe6-a49b-4835-bac7-86fbf33bd6c7"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod62d2ebe6-a49b-4835-bac7-86fbf33bd6c7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod62d2ebe6_a49b_4835_bac7_86fbf33bd6c7.slice" Dec 04 00:05:11 crc kubenswrapper[4764]: E1204 00:05:11.469105 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod62d2ebe6-a49b-4835-bac7-86fbf33bd6c7] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod62d2ebe6-a49b-4835-bac7-86fbf33bd6c7] : Timed out while waiting for systemd to remove kubepods-besteffort-pod62d2ebe6_a49b_4835_bac7_86fbf33bd6c7.slice" pod="openstack/swift-proxy-6677596fcf-6rh2n" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.529791 4764 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod0baff485-3721-45b5-9177-96c30ce03251"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod0baff485-3721-45b5-9177-96c30ce03251] : Timed 
out while waiting for systemd to remove kubepods-besteffort-pod0baff485_3721_45b5_9177_96c30ce03251.slice" Dec 04 00:05:11 crc kubenswrapper[4764]: E1204 00:05:11.529837 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod0baff485-3721-45b5-9177-96c30ce03251] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod0baff485-3721-45b5-9177-96c30ce03251] : Timed out while waiting for systemd to remove kubepods-besteffort-pod0baff485_3721_45b5_9177_96c30ce03251.slice" pod="openstack/nova-scheduler-0" podUID="0baff485-3721-45b5-9177-96c30ce03251" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.678451 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.766097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b8c157-de2d-4811-a625-1a77c3c7b37b-operator-scripts\") pod \"05b8c157-de2d-4811-a625-1a77c3c7b37b\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.766201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg25z\" (UniqueName: \"kubernetes.io/projected/05b8c157-de2d-4811-a625-1a77c3c7b37b-kube-api-access-dg25z\") pod \"05b8c157-de2d-4811-a625-1a77c3c7b37b\" (UID: \"05b8c157-de2d-4811-a625-1a77c3c7b37b\") " Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.766827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05b8c157-de2d-4811-a625-1a77c3c7b37b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05b8c157-de2d-4811-a625-1a77c3c7b37b" (UID: "05b8c157-de2d-4811-a625-1a77c3c7b37b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.782348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05b8c157-de2d-4811-a625-1a77c3c7b37b-kube-api-access-dg25z" (OuterVolumeSpecName: "kube-api-access-dg25z") pod "05b8c157-de2d-4811-a625-1a77c3c7b37b" (UID: "05b8c157-de2d-4811-a625-1a77c3c7b37b"). InnerVolumeSpecName "kube-api-access-dg25z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.867528 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05b8c157-de2d-4811-a625-1a77c3c7b37b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:11 crc kubenswrapper[4764]: I1204 00:05:11.867577 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg25z\" (UniqueName: \"kubernetes.io/projected/05b8c157-de2d-4811-a625-1a77c3c7b37b-kube-api-access-dg25z\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.293478 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.380557 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.389332 4764 generic.go:334] "Generic (PLEG): container finished" podID="bd7a5353-be52-43e9-9490-530240b943fe" containerID="f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2" exitCode=137 Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.389423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement1158-account-delete-dqsj8" event={"ID":"bd7a5353-be52-43e9-9490-530240b943fe","Type":"ContainerDied","Data":"f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2"} Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.389435 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement1158-account-delete-dqsj8" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.389456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement1158-account-delete-dqsj8" event={"ID":"bd7a5353-be52-43e9-9490-530240b943fe","Type":"ContainerDied","Data":"bea567c860a67687e511f5e3309950a0b1674743af37c3d8008a61251396a5e2"} Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.389480 4764 scope.go:117] "RemoveContainer" containerID="f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.392154 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicanc616-account-delete-ls8lk" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.393123 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicanc616-account-delete-ls8lk" event={"ID":"05b8c157-de2d-4811-a625-1a77c3c7b37b","Type":"ContainerDied","Data":"2af4ca914fe57db488f67f940ea99bb6dad3c032cc7e5065b17a93cceba79ced"} Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.398863 4764 generic.go:334] "Generic (PLEG): container finished" podID="0fd61cda-9474-470d-aee3-9806975eccaf" containerID="28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7" exitCode=137 Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.398992 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.400187 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapifdfe-account-delete-lncz4" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.400468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifdfe-account-delete-lncz4" event={"ID":"0fd61cda-9474-470d-aee3-9806975eccaf","Type":"ContainerDied","Data":"28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7"} Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.400507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapifdfe-account-delete-lncz4" event={"ID":"0fd61cda-9474-470d-aee3-9806975eccaf","Type":"ContainerDied","Data":"8368d75af9aa569c86737121e680f0a271b2e9342b16b6cf9f892f561cf9669a"} Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.400662 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6677596fcf-6rh2n" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.401105 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd61cda-9474-470d-aee3-9806975eccaf-operator-scripts\") pod \"0fd61cda-9474-470d-aee3-9806975eccaf\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.401292 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq64z\" (UniqueName: \"kubernetes.io/projected/0fd61cda-9474-470d-aee3-9806975eccaf-kube-api-access-hq64z\") pod \"0fd61cda-9474-470d-aee3-9806975eccaf\" (UID: \"0fd61cda-9474-470d-aee3-9806975eccaf\") " Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.402069 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fd61cda-9474-470d-aee3-9806975eccaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0fd61cda-9474-470d-aee3-9806975eccaf" (UID: "0fd61cda-9474-470d-aee3-9806975eccaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.405402 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fd61cda-9474-470d-aee3-9806975eccaf-kube-api-access-hq64z" (OuterVolumeSpecName: "kube-api-access-hq64z") pod "0fd61cda-9474-470d-aee3-9806975eccaf" (UID: "0fd61cda-9474-470d-aee3-9806975eccaf"). InnerVolumeSpecName "kube-api-access-hq64z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.418598 4764 scope.go:117] "RemoveContainer" containerID="f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2" Dec 04 00:05:12 crc kubenswrapper[4764]: E1204 00:05:12.420998 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2\": container with ID starting with f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2 not found: ID does not exist" containerID="f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.421048 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2"} err="failed to get container status \"f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2\": rpc error: code = NotFound desc = could not find container \"f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2\": container with ID starting with f048fab9ffe12c15deb5f86a02c42d32ac2bc7c55f3ab0b0ffb628d3b14465d2 not found: ID does not exist" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.421076 4764 scope.go:117] "RemoveContainer" containerID="74e119881855aa11e997be4d47415f2b08315ba62b5e9a6db9ebd6318b71cac9" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.444363 4764 scope.go:117] "RemoveContainer" containerID="28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.447542 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.461099 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 00:05:12 crc kubenswrapper[4764]: 
I1204 00:05:12.473221 4764 scope.go:117] "RemoveContainer" containerID="28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7" Dec 04 00:05:12 crc kubenswrapper[4764]: E1204 00:05:12.473884 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7\": container with ID starting with 28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7 not found: ID does not exist" containerID="28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.473925 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7"} err="failed to get container status \"28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7\": rpc error: code = NotFound desc = could not find container \"28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7\": container with ID starting with 28e7354a046d3f49f02f8129158f5b9b8a21d3537e568909552cfbc520de0ef7 not found: ID does not exist" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.479452 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6677596fcf-6rh2n"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.485518 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6677596fcf-6rh2n"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.490282 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicanc616-account-delete-ls8lk"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.495210 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicanc616-account-delete-ls8lk"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.502846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts\") pod \"bd7a5353-be52-43e9-9490-530240b943fe\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.503091 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dsls\" (UniqueName: \"kubernetes.io/projected/bd7a5353-be52-43e9-9490-530240b943fe-kube-api-access-9dsls\") pod \"bd7a5353-be52-43e9-9490-530240b943fe\" (UID: \"bd7a5353-be52-43e9-9490-530240b943fe\") " Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.503248 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd7a5353-be52-43e9-9490-530240b943fe" (UID: "bd7a5353-be52-43e9-9490-530240b943fe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.503464 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd7a5353-be52-43e9-9490-530240b943fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.503548 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0fd61cda-9474-470d-aee3-9806975eccaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.503642 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq64z\" (UniqueName: \"kubernetes.io/projected/0fd61cda-9474-470d-aee3-9806975eccaf-kube-api-access-hq64z\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.505964 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/bd7a5353-be52-43e9-9490-530240b943fe-kube-api-access-9dsls" (OuterVolumeSpecName: "kube-api-access-9dsls") pod "bd7a5353-be52-43e9-9490-530240b943fe" (UID: "bd7a5353-be52-43e9-9490-530240b943fe"). InnerVolumeSpecName "kube-api-access-9dsls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.555801 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05b8c157-de2d-4811-a625-1a77c3c7b37b" path="/var/lib/kubelet/pods/05b8c157-de2d-4811-a625-1a77c3c7b37b/volumes" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.556449 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0baff485-3721-45b5-9177-96c30ce03251" path="/var/lib/kubelet/pods/0baff485-3721-45b5-9177-96c30ce03251/volumes" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.557054 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="155f4570-7769-42ab-8bc0-168dba070531" path="/var/lib/kubelet/pods/155f4570-7769-42ab-8bc0-168dba070531/volumes" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.558300 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" path="/var/lib/kubelet/pods/62d2ebe6-a49b-4835-bac7-86fbf33bd6c7/volumes" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.605756 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dsls\" (UniqueName: \"kubernetes.io/projected/bd7a5353-be52-43e9-9490-530240b943fe-kube-api-access-9dsls\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.710978 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement1158-account-delete-dqsj8"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.715524 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement1158-account-delete-dqsj8"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 
00:05:12.732423 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapifdfe-account-delete-lncz4"] Dec 04 00:05:12 crc kubenswrapper[4764]: I1204 00:05:12.749763 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapifdfe-account-delete-lncz4"] Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.384157 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-754f454454-nb48r" podUID="803d2331-67a9-462d-9e22-09a112264732" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.152:5000/v3\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.405067 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.410056 4764 generic.go:334] "Generic (PLEG): container finished" podID="ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" containerID="858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925" exitCode=137 Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.410113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04790-account-delete-g9286" event={"ID":"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa","Type":"ContainerDied","Data":"858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925"} Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.410135 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell04790-account-delete-g9286" event={"ID":"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa","Type":"ContainerDied","Data":"21b16aaac1bbdcab2a6868dac32429a999c87640ef00a0fb16cd38015e8b8be6"} Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.410151 4764 scope.go:117] "RemoveContainer" containerID="858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925" Dec 04 00:05:13 crc 
kubenswrapper[4764]: I1204 00:05:13.410164 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell04790-account-delete-g9286" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.414008 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.415875 4764 generic.go:334] "Generic (PLEG): container finished" podID="62ce6a22-4796-4b94-9c53-d3088cff26f1" containerID="fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c" exitCode=137 Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.415908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4ef6-account-delete-l854j" event={"ID":"62ce6a22-4796-4b94-9c53-d3088cff26f1","Type":"ContainerDied","Data":"fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c"} Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.415932 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron4ef6-account-delete-l854j" event={"ID":"62ce6a22-4796-4b94-9c53-d3088cff26f1","Type":"ContainerDied","Data":"74c4a4212866150be76b57688d67c2614af9fceaf40e41900e4aadcf71403b1b"} Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.453881 4764 scope.go:117] "RemoveContainer" containerID="858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925" Dec 04 00:05:13 crc kubenswrapper[4764]: E1204 00:05:13.456104 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925\": container with ID starting with 858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925 not found: ID does not exist" containerID="858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.456144 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925"} err="failed to get container status \"858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925\": rpc error: code = NotFound desc = could not find container \"858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925\": container with ID starting with 858d5f61bafcc2329e2ac110c5cfea03f20d85d5a94fe3c82cd2cafdb224e925 not found: ID does not exist" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.456171 4764 scope.go:117] "RemoveContainer" containerID="fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.488376 4764 scope.go:117] "RemoveContainer" containerID="fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c" Dec 04 00:05:13 crc kubenswrapper[4764]: E1204 00:05:13.488884 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c\": container with ID starting with fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c not found: ID does not exist" containerID="fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.488936 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c"} err="failed to get container status \"fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c\": rpc error: code = NotFound desc = could not find container \"fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c\": container with ID starting with fca2baf9c15d771eb2f4aa6b9038827192b4a3eec61b8b885a649862b7a9fa1c not found: ID does not exist" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 
00:05:13.525887 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnplj\" (UniqueName: \"kubernetes.io/projected/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-kube-api-access-rnplj\") pod \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.525975 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-operator-scripts\") pod \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\" (UID: \"ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa\") " Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.526002 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ce6a22-4796-4b94-9c53-d3088cff26f1-operator-scripts\") pod \"62ce6a22-4796-4b94-9c53-d3088cff26f1\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.526023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t76xs\" (UniqueName: \"kubernetes.io/projected/62ce6a22-4796-4b94-9c53-d3088cff26f1-kube-api-access-t76xs\") pod \"62ce6a22-4796-4b94-9c53-d3088cff26f1\" (UID: \"62ce6a22-4796-4b94-9c53-d3088cff26f1\") " Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.527183 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" (UID: "ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.527328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ce6a22-4796-4b94-9c53-d3088cff26f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62ce6a22-4796-4b94-9c53-d3088cff26f1" (UID: "62ce6a22-4796-4b94-9c53-d3088cff26f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.535142 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-kube-api-access-rnplj" (OuterVolumeSpecName: "kube-api-access-rnplj") pod "ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" (UID: "ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa"). InnerVolumeSpecName "kube-api-access-rnplj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.537352 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ce6a22-4796-4b94-9c53-d3088cff26f1-kube-api-access-t76xs" (OuterVolumeSpecName: "kube-api-access-t76xs") pod "62ce6a22-4796-4b94-9c53-d3088cff26f1" (UID: "62ce6a22-4796-4b94-9c53-d3088cff26f1"). InnerVolumeSpecName "kube-api-access-t76xs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.627824 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnplj\" (UniqueName: \"kubernetes.io/projected/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-kube-api-access-rnplj\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.627850 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.627869 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ce6a22-4796-4b94-9c53-d3088cff26f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.627877 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t76xs\" (UniqueName: \"kubernetes.io/projected/62ce6a22-4796-4b94-9c53-d3088cff26f1-kube-api-access-t76xs\") on node \"crc\" DevicePath \"\"" Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.739828 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell04790-account-delete-g9286"] Dec 04 00:05:13 crc kubenswrapper[4764]: I1204 00:05:13.797545 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell04790-account-delete-g9286"] Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.426549 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron4ef6-account-delete-l854j" Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.456754 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron4ef6-account-delete-l854j"] Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.464317 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron4ef6-account-delete-l854j"] Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.561858 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fd61cda-9474-470d-aee3-9806975eccaf" path="/var/lib/kubelet/pods/0fd61cda-9474-470d-aee3-9806975eccaf/volumes" Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.562857 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ce6a22-4796-4b94-9c53-d3088cff26f1" path="/var/lib/kubelet/pods/62ce6a22-4796-4b94-9c53-d3088cff26f1/volumes" Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.563846 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7a5353-be52-43e9-9490-530240b943fe" path="/var/lib/kubelet/pods/bd7a5353-be52-43e9-9490-530240b943fe/volumes" Dec 04 00:05:14 crc kubenswrapper[4764]: I1204 00:05:14.564790 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" path="/var/lib/kubelet/pods/ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa/volumes" Dec 04 00:05:20 crc kubenswrapper[4764]: I1204 00:05:20.869191 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:05:20 crc kubenswrapper[4764]: I1204 00:05:20.869870 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:05:50 crc kubenswrapper[4764]: I1204 00:05:50.869629 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:05:50 crc kubenswrapper[4764]: I1204 00:05:50.870252 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:05:50 crc kubenswrapper[4764]: I1204 00:05:50.870308 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:05:50 crc kubenswrapper[4764]: I1204 00:05:50.871088 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b526b3cd6175d5387ff9cf45d5f19a22adfa1e890770ac42ded7e5b8a5bf721a"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:05:50 crc kubenswrapper[4764]: I1204 00:05:50.871212 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://b526b3cd6175d5387ff9cf45d5f19a22adfa1e890770ac42ded7e5b8a5bf721a" gracePeriod=600 Dec 04 00:05:51 crc kubenswrapper[4764]: I1204 
00:05:51.862358 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="b526b3cd6175d5387ff9cf45d5f19a22adfa1e890770ac42ded7e5b8a5bf721a" exitCode=0 Dec 04 00:05:51 crc kubenswrapper[4764]: I1204 00:05:51.862472 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"b526b3cd6175d5387ff9cf45d5f19a22adfa1e890770ac42ded7e5b8a5bf721a"} Dec 04 00:05:51 crc kubenswrapper[4764]: I1204 00:05:51.862749 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1"} Dec 04 00:05:51 crc kubenswrapper[4764]: I1204 00:05:51.862778 4764 scope.go:117] "RemoveContainer" containerID="a2a1a3ac2c269173a49ebd8f63b614762be69151c2f69effa92f89083eb82227" Dec 04 00:06:06 crc kubenswrapper[4764]: I1204 00:06:06.515070 4764 scope.go:117] "RemoveContainer" containerID="f95f4c5c24d79cd3be5e889124f00a07909163b761e1d21f5535b51adffcdebb" Dec 04 00:06:06 crc kubenswrapper[4764]: I1204 00:06:06.559355 4764 scope.go:117] "RemoveContainer" containerID="40932f1f044feae057b1145cd8eb76e3370493aa44a7c5f0f8b568439dbde7ab" Dec 04 00:06:06 crc kubenswrapper[4764]: I1204 00:06:06.587378 4764 scope.go:117] "RemoveContainer" containerID="2e6356fadfa1adfca6e96dc95381719e5e560cd54e6aadbce9637ca044d759f1" Dec 04 00:06:06 crc kubenswrapper[4764]: I1204 00:06:06.608813 4764 scope.go:117] "RemoveContainer" containerID="0a7a50e6e0b034c7eecd5bfe06e2369a2fb9f54069521e04f4cbe475120f4ccd" Dec 04 00:06:06 crc kubenswrapper[4764]: I1204 00:06:06.635118 4764 scope.go:117] "RemoveContainer" containerID="eaaabf3d0d46d02c83dcc8d6773188d2b52ac585608db6c2550768e3791de973" Dec 04 00:06:06 crc kubenswrapper[4764]: 
I1204 00:06:06.677291 4764 scope.go:117] "RemoveContainer" containerID="72afd36f841bf235f0a272b4efab0af76e96e440e7957530d140b81cfcc5ba3d" Dec 04 00:06:06 crc kubenswrapper[4764]: I1204 00:06:06.706471 4764 scope.go:117] "RemoveContainer" containerID="2d73d87901dced518fbff8a4656eb6e2c9223d98c28022fedebac00cf11bf4dc" Dec 04 00:07:06 crc kubenswrapper[4764]: I1204 00:07:06.892834 4764 scope.go:117] "RemoveContainer" containerID="d13466ee54bb26aca109c5905a35596e448e0e1848d2f230f30f0947a39a72a8" Dec 04 00:07:06 crc kubenswrapper[4764]: I1204 00:07:06.944451 4764 scope.go:117] "RemoveContainer" containerID="60ba5da4792a87b1525f0ffb79e190a4fc0bc794c1d89f435442a97dc33fe1b1" Dec 04 00:07:06 crc kubenswrapper[4764]: I1204 00:07:06.978011 4764 scope.go:117] "RemoveContainer" containerID="ab31569a4bf1bd2101806486db1ae7ca52640ffa88e43ec4ead8858ebfefecfb" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.007604 4764 scope.go:117] "RemoveContainer" containerID="f956e9fd6be8e36f08c3d2e54ef6651af5346794be3c9e0b2c337f6520a43b6f" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.047563 4764 scope.go:117] "RemoveContainer" containerID="673b3fe77d5ce18df617b4e4a03402aed38e4d2372f2fea89273e416b06250ea" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.066325 4764 scope.go:117] "RemoveContainer" containerID="f2a95f73654fe1711cc05ab55f3be9597380a0876777040a000064c07657a226" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.091259 4764 scope.go:117] "RemoveContainer" containerID="f8de5b50ece874cdc4fb530bcf1a5c776706712455e9b3f425f252f6afefe588" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.117848 4764 scope.go:117] "RemoveContainer" containerID="aeeb1f8cd6a357fb4921a9a73369f690e025e106efab990b4dd29d561bf85075" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.132563 4764 scope.go:117] "RemoveContainer" containerID="bb708e2062718e88da9e223f7790ff88531c35109792f053cd5238a10956d772" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.146451 4764 
scope.go:117] "RemoveContainer" containerID="5762b0af951828ae8d0afb335be9c60e1432402c4b4fc601d8d05ef67a4d97cf" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.176967 4764 scope.go:117] "RemoveContainer" containerID="3ea7362b2e46685a8e6f0b8977d5d3a7b0a135c14758f5677b71e9e97057f3b0" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.216585 4764 scope.go:117] "RemoveContainer" containerID="2abca4237fb017c47159b6d2841736e119b9cc88ea0295253fb1efe2f7642af6" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.236248 4764 scope.go:117] "RemoveContainer" containerID="8abc6b33fb7fe9bf79a76e4d8b9e725d01e6670c7d048013f125d40d01b42df8" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.275134 4764 scope.go:117] "RemoveContainer" containerID="058b0b320e4f649818a6f21558010cb53f95ece570ce4947f8e23a700f404e0f" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.294021 4764 scope.go:117] "RemoveContainer" containerID="3f2b39a4c4fb701732f068df4fd94e2b91815aa2dc381f6953b092ba79953e8d" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.312093 4764 scope.go:117] "RemoveContainer" containerID="f1c48e30fd43fed9382b6ed8ffc8d238c4969b51e7d651411bac1bc7845c92ad" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.340566 4764 scope.go:117] "RemoveContainer" containerID="ed9a7bb446932db4138c96413b14192c82a00e714736a59bb5dacccc3da2a6dd" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.365809 4764 scope.go:117] "RemoveContainer" containerID="4cde283132633ce22f471f0f58be753fa22ff0b0595a209ec83ed97ac2675ac1" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.386250 4764 scope.go:117] "RemoveContainer" containerID="b663718dd91cec728714f509ace36974bc05c1712376e3e482576b0703933bac" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.402165 4764 scope.go:117] "RemoveContainer" containerID="a1651179f0300b2698a60d331e5872234d6b46c899dc9fbf2c21a9e9a05ec719" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.419301 4764 scope.go:117] 
"RemoveContainer" containerID="479e4e6aa54d02d281f466e2c912ec3552c93b6120c0b7b2a56e584d81dec421" Dec 04 00:07:07 crc kubenswrapper[4764]: I1204 00:07:07.452464 4764 scope.go:117] "RemoveContainer" containerID="df775a1093974954fab259790271001b41d634ad321f7517bb39959e4a9fed30" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.827350 4764 scope.go:117] "RemoveContainer" containerID="94f6b973870b4eea1c81fd05faef27612f6926a743b37cea298cec8baf77cde6" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.879348 4764 scope.go:117] "RemoveContainer" containerID="efeadb11f0ef1de95143824a054c252e1663f1c06d14e2f5070aa509d85dd5bf" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.901127 4764 scope.go:117] "RemoveContainer" containerID="9a0f09120101a0329c1149dc889b69fb1f61190068470f9d33bdfee1ebb1a78d" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.917330 4764 scope.go:117] "RemoveContainer" containerID="71b9323307a4b2c17ee25a0cbf4f507e0f54dcd057e06354d3d97cbd5b67d385" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.937888 4764 scope.go:117] "RemoveContainer" containerID="24fed489eecbcfbdd5a929e01e1c310bcf42cb3baa89d27eff4bf18f0bf16997" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.956150 4764 scope.go:117] "RemoveContainer" containerID="f135ac215141bc2bfaf1b9e15150725a2cd141128b694948ba130731a69c2c31" Dec 04 00:08:07 crc kubenswrapper[4764]: I1204 00:08:07.994087 4764 scope.go:117] "RemoveContainer" containerID="2398fc991b15ae90f82ce48cda9c9af81acdb7bfc25b1505a825f6d849c9810b" Dec 04 00:08:20 crc kubenswrapper[4764]: I1204 00:08:20.868680 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:08:20 crc kubenswrapper[4764]: I1204 00:08:20.869336 4764 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:08:50 crc kubenswrapper[4764]: I1204 00:08:50.869422 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:08:50 crc kubenswrapper[4764]: I1204 00:08:50.870238 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.070073 4764 scope.go:117] "RemoveContainer" containerID="f01aa606ef44b4da2e845d6aab65dfe450e2d390badf9f1d24c7c03e50b3beb3" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.096368 4764 scope.go:117] "RemoveContainer" containerID="dce80c7bf64542d50a5a35a9db4e990247ca9ea297c3adc3ddaa099d19135abb" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.132474 4764 scope.go:117] "RemoveContainer" containerID="29a9588dd349d2237fc97952dcf0d1d4dd46c4740977eb7c054e0a2cb436f498" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.152956 4764 scope.go:117] "RemoveContainer" containerID="e37bdb0112bb5af89123cbef2d5134d36eba12df511aac7c26c14ffd85dc3bc2" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.180172 4764 scope.go:117] "RemoveContainer" containerID="e09650ac3d53e1b4d98c00a093044328b22aa9b28e30aaf0b37a44249f841cb1" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.215195 
4764 scope.go:117] "RemoveContainer" containerID="fac18db9328e4748c82b0da435e7fd4230232dd05715f2d69d781fb2da6f13d5" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.269559 4764 scope.go:117] "RemoveContainer" containerID="c2572b9c7d658fe7e9003dfd8ad292f6e1b9dfdbf825855b6980c2f1bc6cc2a5" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.314483 4764 scope.go:117] "RemoveContainer" containerID="e26f24e6446b7d99d0fcfd4d59fe0434ee5a259e6dcb1fa6dde3cea82250aad3" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.356985 4764 scope.go:117] "RemoveContainer" containerID="70f214ed8bf826d8b28709dc9706d2d24a3f20c75076ecb161b40b621f1f4eaa" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.384266 4764 scope.go:117] "RemoveContainer" containerID="5dd7869d059a76707a2c2390bd0370902c7d32aa5bdbbcfb6ebc6bde71de3eb3" Dec 04 00:09:08 crc kubenswrapper[4764]: I1204 00:09:08.404217 4764 scope.go:117] "RemoveContainer" containerID="6b48922dbe5bd6a9e5a211a4a63d2a70064129d41782d8dbd285376d4d5294eb" Dec 04 00:09:20 crc kubenswrapper[4764]: I1204 00:09:20.868955 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:09:20 crc kubenswrapper[4764]: I1204 00:09:20.869827 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:09:20 crc kubenswrapper[4764]: I1204 00:09:20.869924 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:09:20 crc 
kubenswrapper[4764]: I1204 00:09:20.871364 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:09:20 crc kubenswrapper[4764]: I1204 00:09:20.871486 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" gracePeriod=600 Dec 04 00:09:21 crc kubenswrapper[4764]: E1204 00:09:21.003439 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:09:21 crc kubenswrapper[4764]: I1204 00:09:21.118596 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" exitCode=0 Dec 04 00:09:21 crc kubenswrapper[4764]: I1204 00:09:21.118655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1"} Dec 04 00:09:21 crc kubenswrapper[4764]: I1204 00:09:21.118780 4764 scope.go:117] "RemoveContainer" 
containerID="b526b3cd6175d5387ff9cf45d5f19a22adfa1e890770ac42ded7e5b8a5bf721a" Dec 04 00:09:21 crc kubenswrapper[4764]: I1204 00:09:21.119588 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:09:21 crc kubenswrapper[4764]: E1204 00:09:21.120139 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:09:35 crc kubenswrapper[4764]: I1204 00:09:35.546686 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:09:35 crc kubenswrapper[4764]: E1204 00:09:35.547954 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:09:46 crc kubenswrapper[4764]: I1204 00:09:46.545945 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:09:46 crc kubenswrapper[4764]: E1204 00:09:46.546808 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:09:59 crc kubenswrapper[4764]: I1204 00:09:59.546224 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:09:59 crc kubenswrapper[4764]: E1204 00:09:59.547054 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:10:08 crc kubenswrapper[4764]: I1204 00:10:08.587822 4764 scope.go:117] "RemoveContainer" containerID="af3ad162775dbb086e4ee57f257df51f07acc80478f78e561812ab0e3ff974ec" Dec 04 00:10:10 crc kubenswrapper[4764]: I1204 00:10:10.545522 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:10:10 crc kubenswrapper[4764]: E1204 00:10:10.546097 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:10:21 crc kubenswrapper[4764]: I1204 00:10:21.545844 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:10:21 crc kubenswrapper[4764]: E1204 00:10:21.546574 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:10:33 crc kubenswrapper[4764]: I1204 00:10:33.546841 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:10:33 crc kubenswrapper[4764]: E1204 00:10:33.547856 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:10:45 crc kubenswrapper[4764]: I1204 00:10:45.546022 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:10:45 crc kubenswrapper[4764]: E1204 00:10:45.546551 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:11:00 crc kubenswrapper[4764]: I1204 00:11:00.547016 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:11:00 crc kubenswrapper[4764]: E1204 00:11:00.548307 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:11:08 crc kubenswrapper[4764]: I1204 00:11:08.664405 4764 scope.go:117] "RemoveContainer" containerID="81e666919c04f56edfdbd9ce12296a18ba6926e9bd2b9f9641241de048362065" Dec 04 00:11:08 crc kubenswrapper[4764]: I1204 00:11:08.688901 4764 scope.go:117] "RemoveContainer" containerID="bb2e7ee1217f8d3f46ff3a25bdc283e42671a7b464b00ec6b61dc689fa57b84b" Dec 04 00:11:14 crc kubenswrapper[4764]: I1204 00:11:14.549832 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:11:14 crc kubenswrapper[4764]: E1204 00:11:14.550376 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:11:27 crc kubenswrapper[4764]: I1204 00:11:27.545527 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:11:27 crc kubenswrapper[4764]: E1204 00:11:27.546705 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:11:42 crc kubenswrapper[4764]: I1204 00:11:42.546255 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:11:42 crc kubenswrapper[4764]: E1204 00:11:42.547443 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:11:55 crc kubenswrapper[4764]: I1204 00:11:55.546116 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:11:55 crc kubenswrapper[4764]: E1204 00:11:55.546792 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:12:06 crc kubenswrapper[4764]: I1204 00:12:06.546292 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:12:06 crc kubenswrapper[4764]: E1204 00:12:06.547359 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:12:21 crc kubenswrapper[4764]: I1204 00:12:21.550489 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:12:21 crc kubenswrapper[4764]: E1204 00:12:21.551650 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:12:36 crc kubenswrapper[4764]: I1204 00:12:36.546255 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:12:36 crc kubenswrapper[4764]: E1204 00:12:36.547833 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:12:51 crc kubenswrapper[4764]: I1204 00:12:51.545919 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:12:51 crc kubenswrapper[4764]: E1204 00:12:51.546833 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:13:05 crc kubenswrapper[4764]: I1204 00:13:05.545364 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:13:05 crc kubenswrapper[4764]: E1204 00:13:05.546174 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:13:19 crc kubenswrapper[4764]: I1204 00:13:19.547002 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:13:19 crc kubenswrapper[4764]: E1204 00:13:19.548512 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:13:34 crc kubenswrapper[4764]: I1204 00:13:34.551045 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:13:34 crc kubenswrapper[4764]: E1204 00:13:34.552863 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:13:45 crc kubenswrapper[4764]: I1204 00:13:45.545692 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:13:45 crc kubenswrapper[4764]: E1204 00:13:45.546436 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:13:58 crc kubenswrapper[4764]: I1204 00:13:58.546201 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:13:58 crc kubenswrapper[4764]: E1204 00:13:58.546950 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.662412 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jsnrv"] Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.663898 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker" Dec 04 00:14:07 crc 
kubenswrapper[4764]: I1204 00:14:07.663931 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.663981 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ce6a22-4796-4b94-9c53-d3088cff26f1" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664002 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ce6a22-4796-4b94-9c53-d3088cff26f1" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664033 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803d2331-67a9-462d-9e22-09a112264732" containerName="keystone-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664050 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="803d2331-67a9-462d-9e22-09a112264732" containerName="keystone-api" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664077 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="ovsdbserver-sb" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664093 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="ovsdbserver-sb" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664130 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-central-agent" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664151 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-central-agent" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664184 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="setup-container" Dec 04 
00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664201 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="setup-container" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664231 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664248 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664268 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server-init" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664280 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server-init" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664305 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664318 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664341 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="setup-container" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664354 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="setup-container" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664375 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-api" Dec 04 00:14:07 crc 
kubenswrapper[4764]: I1204 00:14:07.664387 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-api" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664416 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664434 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664468 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664485 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-server" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664508 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664523 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664554 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664572 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664592 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api-log" Dec 04 
00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664608 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664623 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="rsync" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664637 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="rsync" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664654 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="ovn-northd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664672 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="ovn-northd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664698 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7e0298-05be-4a37-a1d3-44632ea1d770" containerName="kube-state-metrics" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664748 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7e0298-05be-4a37-a1d3-44632ea1d770" containerName="kube-state-metrics" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664785 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664802 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664827 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: 
I1204 00:14:07.664844 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664866 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="swift-recon-cron" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664884 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="swift-recon-cron" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664917 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="rabbitmq" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664934 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="rabbitmq" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.664969 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.664988 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="cinder-scheduler" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665028 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="cinder-scheduler" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" containerName="nova-cell0-conductor-conductor" Dec 04 00:14:07 crc kubenswrapper[4764]: 
I1204 00:14:07.665064 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" containerName="nova-cell0-conductor-conductor" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665086 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8586564-9024-4375-a5f7-e75844abe723" containerName="mysql-bootstrap" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665104 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8586564-9024-4375-a5f7-e75844abe723" containerName="mysql-bootstrap" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665126 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="662de035-d0f1-4a65-98ad-161d6f21bd26" containerName="memcached" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665142 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="662de035-d0f1-4a65-98ad-161d6f21bd26" containerName="memcached" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7a5353-be52-43e9-9490-530240b943fe" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665183 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7a5353-be52-43e9-9490-530240b943fe" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665213 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665229 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665252 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker-log" Dec 04 00:14:07 crc kubenswrapper[4764]: 
I1204 00:14:07.665269 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665290 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665306 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665330 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6160ab00-1691-41f8-9902-80d33e435770" containerName="mysql-bootstrap" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665347 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6160ab00-1691-41f8-9902-80d33e435770" containerName="mysql-bootstrap" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665367 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665383 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665416 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-notification-agent" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665434 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-notification-agent" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665465 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-metadata" 
Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665483 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-metadata" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665504 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665521 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665544 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665562 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-api" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665583 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05b8c157-de2d-4811-a625-1a77c3c7b37b" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665596 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="05b8c157-de2d-4811-a625-1a77c3c7b37b" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665621 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-expirer" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665634 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-expirer" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665655 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0baff485-3721-45b5-9177-96c30ce03251" 
containerName="nova-scheduler-scheduler" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665672 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0baff485-3721-45b5-9177-96c30ce03251" containerName="nova-scheduler-scheduler" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665703 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665754 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665780 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8586564-9024-4375-a5f7-e75844abe723" containerName="galera" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665796 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8586564-9024-4375-a5f7-e75844abe723" containerName="galera" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665827 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665846 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-server" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665868 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665885 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665911 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665947 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98b9272-87ec-43a2-97a7-7f08cdafbf2c" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.665965 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98b9272-87ec-43a2-97a7-7f08cdafbf2c" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.665988 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="sg-core" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666004 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="sg-core" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666030 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666046 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666093 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666111 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666143 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-updater" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666158 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-updater" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666186 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666199 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-server" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666221 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666235 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666252 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fd61cda-9474-470d-aee3-9806975eccaf" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666265 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fd61cda-9474-470d-aee3-9806975eccaf" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666288 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa27bce9-febf-497d-ad48-21b087064f34" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666301 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa27bce9-febf-497d-ad48-21b087064f34" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666320 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666332 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-api" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666349 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-reaper" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666362 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-reaper" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666383 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-updater" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666397 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-updater" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666411 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666424 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666443 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666458 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666474 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666524 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666553 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666572 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-server" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666600 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74af5cde-29d3-4ff7-803b-fb335fc8209c" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666619 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74af5cde-29d3-4ff7-803b-fb335fc8209c" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666649 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666667 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-log" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666689 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="proxy-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666706 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="proxy-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666766 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="047e426c-4178-43ce-8a09-ff5b4a6a13f1" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666781 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="047e426c-4178-43ce-8a09-ff5b4a6a13f1" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666798 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6160ab00-1691-41f8-9902-80d33e435770" containerName="galera" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666812 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6160ab00-1691-41f8-9902-80d33e435770" containerName="galera" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666828 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666841 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666855 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155f4570-7769-42ab-8bc0-168dba070531" containerName="nova-cell1-conductor-conductor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666868 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="155f4570-7769-42ab-8bc0-168dba070531" containerName="nova-cell1-conductor-conductor" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666887 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666900 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666916 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="rabbitmq" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666928 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="rabbitmq" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666946 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666958 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.666975 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="probe" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.666988 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="probe" Dec 04 00:14:07 crc kubenswrapper[4764]: E1204 00:14:07.667007 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667020 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667310 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fd61cda-9474-470d-aee3-9806975eccaf" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667335 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667363 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667382 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667408 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667431 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667454 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa27bce9-febf-497d-ad48-21b087064f34" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667488 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="803d2331-67a9-462d-9e22-09a112264732" containerName="keystone-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667505 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="662de035-d0f1-4a65-98ad-161d6f21bd26" containerName="memcached" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667535 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667565 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98b9272-87ec-43a2-97a7-7f08cdafbf2c" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667584 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667613 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovs-vswitchd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667633 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="ovn-northd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667650 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667682 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667976 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="swift-recon-cron" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.667998 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecdf180a-27ed-4dcd-a1c6-c84e575ad3aa" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668023 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7e0298-05be-4a37-a1d3-44632ea1d770" containerName="kube-state-metrics" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668054 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8499c909-53fe-4742-aa11-29e214451689" containerName="neutron-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668079 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f105a7d8-bb79-4578-98fd-aca60d5ffa10" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668096 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api" Dec 04 00:14:07 crc 
kubenswrapper[4764]: I1204 00:14:07.668116 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668139 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668169 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6160ab00-1691-41f8-9902-80d33e435770" containerName="galera" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668195 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cad4f7f-7546-406c-822b-b6f77365d830" containerName="placement-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668227 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="ceilometer-notification-agent" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668252 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7a5353-be52-43e9-9490-530240b943fe" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668271 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-api" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668285 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="ovsdbserver-sb" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668306 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="249c3a6b-9345-49ed-9b2d-a0991fb02dc0" containerName="openstack-network-exporter" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668328 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bda43f61-31ae-4c4c-967e-f0e8d13f5ae9" containerName="rabbitmq" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668346 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8586564-9024-4375-a5f7-e75844abe723" containerName="galera" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668363 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0baff485-3721-45b5-9177-96c30ce03251" containerName="nova-scheduler-scheduler" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668377 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ce6a22-4796-4b94-9c53-d3088cff26f1" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668401 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef3ecde-294a-410a-ba90-d08a00674b9f" containerName="glance-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668415 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-expirer" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668436 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="probe" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668455 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="80b73a08-1afd-4f2a-b565-fd8f2b06c0b6" containerName="cinder-scheduler" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668476 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee8deb66-8364-4d9c-bd17-e4ad937a35e2" containerName="barbican-keystone-listener-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668495 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="76708e9b-1db4-42ca-94d2-7ff96d08d855" containerName="rabbitmq" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668519 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-updater" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668533 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-updater" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668546 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="rsync" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668568 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d2ebe6-a49b-4835-bac7-86fbf33bd6c7" containerName="proxy-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668587 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f850034-7f6e-4811-b98f-89648c559dcd" containerName="ovsdb-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668606 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae2d9b02-4247-444e-ba56-05d65493dd3e" containerName="nova-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668628 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="047e426c-4178-43ce-8a09-ff5b4a6a13f1" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668645 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="container-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668665 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="155f4570-7769-42ab-8bc0-168dba070531" containerName="nova-cell1-conductor-conductor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668688 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 
00:14:07.668706 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="sg-core" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668834 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-reaper" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668856 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="58eedbd8-7bbd-444f-bd11-784c5e7429fa" containerName="nova-cell0-conductor-conductor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668876 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-server" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668898 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668914 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="account-auditor" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668930 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="05b8c157-de2d-4811-a625-1a77c3c7b37b" containerName="mariadb-account-delete" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668954 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.668983 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7323df53-27cc-46a0-ad81-1e916db379af" containerName="cinder-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669012 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" 
containerName="ceilometer-central-agent" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669037 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6de1323-46ca-460b-8a8f-620125ce1d7f" containerName="glance-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669066 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74af5cde-29d3-4ff7-803b-fb335fc8209c" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669091 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5514d84f-88e9-4b13-9b5d-1c17c15fc0b6" containerName="proxy-httpd" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669109 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9218a48-75e0-47ae-a2ac-2d2fa4d08971" containerName="nova-metadata-metadata" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669129 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a7dd687-d272-4102-bc70-199b44353a21" containerName="barbican-worker" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669146 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1691fb5b-c57a-4773-9710-347c99bd9712" containerName="object-replicator" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669169 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab264d6c-eecf-496f-b505-39b128dd8e44" containerName="ovn-controller" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.669185 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec3e74e4-e0bc-45a3-a568-c70087b73572" containerName="barbican-api-log" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.671704 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.677356 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsnrv"] Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.849416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-catalog-content\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.849468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-utilities\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.849831 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwpvz\" (UniqueName: \"kubernetes.io/projected/4ee85068-9627-4e7f-9e3e-2365aae5fef3-kube-api-access-pwpvz\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.951803 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-catalog-content\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.951872 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-utilities\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.951949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwpvz\" (UniqueName: \"kubernetes.io/projected/4ee85068-9627-4e7f-9e3e-2365aae5fef3-kube-api-access-pwpvz\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.952319 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-utilities\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.952448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-catalog-content\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.974654 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwpvz\" (UniqueName: \"kubernetes.io/projected/4ee85068-9627-4e7f-9e3e-2365aae5fef3-kube-api-access-pwpvz\") pod \"redhat-marketplace-jsnrv\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:07 crc kubenswrapper[4764]: I1204 00:14:07.999208 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.243750 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r59p6"] Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.247403 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.253736 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r59p6"] Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.256945 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-catalog-content\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.256991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzxhk\" (UniqueName: \"kubernetes.io/projected/8c567f8d-76a3-4f55-a61a-930b934047bb-kube-api-access-kzxhk\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.257045 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-utilities\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.358595 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-catalog-content\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.358934 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzxhk\" (UniqueName: \"kubernetes.io/projected/8c567f8d-76a3-4f55-a61a-930b934047bb-kube-api-access-kzxhk\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.359035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-utilities\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.359206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-catalog-content\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.359542 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-utilities\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.379341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzxhk\" (UniqueName: 
\"kubernetes.io/projected/8c567f8d-76a3-4f55-a61a-930b934047bb-kube-api-access-kzxhk\") pod \"redhat-operators-r59p6\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.474637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsnrv"] Dec 04 00:14:08 crc kubenswrapper[4764]: I1204 00:14:08.573533 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.013979 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r59p6"] Dec 04 00:14:09 crc kubenswrapper[4764]: W1204 00:14:09.019917 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c567f8d_76a3_4f55_a61a_930b934047bb.slice/crio-a732cc1367e642c9f255a5eab37dbe8f4fd107b31d370610dbcb240417ee96b5 WatchSource:0}: Error finding container a732cc1367e642c9f255a5eab37dbe8f4fd107b31d370610dbcb240417ee96b5: Status 404 returned error can't find the container with id a732cc1367e642c9f255a5eab37dbe8f4fd107b31d370610dbcb240417ee96b5 Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.485435 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerID="25ab066ca02d1d23ca25264ed2ff74101e869b7fcf524345212e985a193e43dd" exitCode=0 Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.485517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerDied","Data":"25ab066ca02d1d23ca25264ed2ff74101e869b7fcf524345212e985a193e43dd"} Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.485770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerStarted","Data":"573a0480edf65ea98d61ba3a4beed960898917484642cf6c8cee4cb95be5d59a"} Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.488315 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.488362 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerID="50916ea74bb1ab58c9f82b4e154204dfdc7e16545aa4617a506de7120f4e19a9" exitCode=0 Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.488380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerDied","Data":"50916ea74bb1ab58c9f82b4e154204dfdc7e16545aa4617a506de7120f4e19a9"} Dec 04 00:14:09 crc kubenswrapper[4764]: I1204 00:14:09.488581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerStarted","Data":"a732cc1367e642c9f255a5eab37dbe8f4fd107b31d370610dbcb240417ee96b5"} Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.055587 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72wz4"] Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.063160 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.072309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72wz4"] Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.087906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-catalog-content\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.088049 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-utilities\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.088144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh86q\" (UniqueName: \"kubernetes.io/projected/1ccaff1b-c77d-4a49-b065-df422e743d3a-kube-api-access-fh86q\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.189162 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-catalog-content\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.189218 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-utilities\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.189250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh86q\" (UniqueName: \"kubernetes.io/projected/1ccaff1b-c77d-4a49-b065-df422e743d3a-kube-api-access-fh86q\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.189842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-catalog-content\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.190085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-utilities\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.227993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh86q\" (UniqueName: \"kubernetes.io/projected/1ccaff1b-c77d-4a49-b065-df422e743d3a-kube-api-access-fh86q\") pod \"certified-operators-72wz4\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.399206 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.498422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerStarted","Data":"51300f29424f52bb36ea11c98af112380cc6f227ae42d1e56cbece7a5b7da0fd"} Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.504022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerStarted","Data":"ca5a4d8bd71ad4d56b2ea9e85fd453de3d8fd4e7532d118c7842b8870c91afe5"} Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.635353 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjsbx"] Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.637687 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.646386 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjsbx"] Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.696045 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-utilities\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.696129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdsx\" (UniqueName: \"kubernetes.io/projected/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-kube-api-access-spdsx\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.696196 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-catalog-content\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.797068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-utilities\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.797130 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-spdsx\" (UniqueName: \"kubernetes.io/projected/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-kube-api-access-spdsx\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.797166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-catalog-content\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.797580 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-utilities\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.797587 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-catalog-content\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.825315 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdsx\" (UniqueName: \"kubernetes.io/projected/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-kube-api-access-spdsx\") pod \"community-operators-tjsbx\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:10 crc kubenswrapper[4764]: W1204 00:14:10.896978 4764 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ccaff1b_c77d_4a49_b065_df422e743d3a.slice/crio-8b00d33406f818be7da0e6696f81dbef9659a4786f1c53e6ea0f403610e114c0 WatchSource:0}: Error finding container 8b00d33406f818be7da0e6696f81dbef9659a4786f1c53e6ea0f403610e114c0: Status 404 returned error can't find the container with id 8b00d33406f818be7da0e6696f81dbef9659a4786f1c53e6ea0f403610e114c0 Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.897923 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72wz4"] Dec 04 00:14:10 crc kubenswrapper[4764]: I1204 00:14:10.956242 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.196134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjsbx"] Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.517947 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerID="51300f29424f52bb36ea11c98af112380cc6f227ae42d1e56cbece7a5b7da0fd" exitCode=0 Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.518018 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerDied","Data":"51300f29424f52bb36ea11c98af112380cc6f227ae42d1e56cbece7a5b7da0fd"} Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.522284 4764 generic.go:334] "Generic (PLEG): container finished" podID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerID="9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b" exitCode=0 Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.522348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjsbx" 
event={"ID":"e57206c8-e2cc-4ba4-869c-addb7da8eb7e","Type":"ContainerDied","Data":"9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b"} Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.522410 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjsbx" event={"ID":"e57206c8-e2cc-4ba4-869c-addb7da8eb7e","Type":"ContainerStarted","Data":"9d798480f482884bf22ed180e50b189b5145603d3b6a52942f4cfc5952d0d496"} Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.525993 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerID="ca5a4d8bd71ad4d56b2ea9e85fd453de3d8fd4e7532d118c7842b8870c91afe5" exitCode=0 Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.526055 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerDied","Data":"ca5a4d8bd71ad4d56b2ea9e85fd453de3d8fd4e7532d118c7842b8870c91afe5"} Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.528330 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerID="2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276" exitCode=0 Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.528364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72wz4" event={"ID":"1ccaff1b-c77d-4a49-b065-df422e743d3a","Type":"ContainerDied","Data":"2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276"} Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 00:14:11.528393 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72wz4" event={"ID":"1ccaff1b-c77d-4a49-b065-df422e743d3a","Type":"ContainerStarted","Data":"8b00d33406f818be7da0e6696f81dbef9659a4786f1c53e6ea0f403610e114c0"} Dec 04 00:14:11 crc kubenswrapper[4764]: I1204 
00:14:11.546481 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:14:11 crc kubenswrapper[4764]: E1204 00:14:11.546804 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.553081 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerID="05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3" exitCode=0 Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.556568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerStarted","Data":"801839d807e6bc2936182602bf56894c440dc167e70950125d39cb36c05aca75"} Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.556735 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72wz4" event={"ID":"1ccaff1b-c77d-4a49-b065-df422e743d3a","Type":"ContainerDied","Data":"05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3"} Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.557423 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerStarted","Data":"3fc72dcc7e97e3a599f80abe86f6cce96037b74fc60521b5785596656ff3d27d"} Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.560680 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerID="98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5" exitCode=0 Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.560753 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjsbx" event={"ID":"e57206c8-e2cc-4ba4-869c-addb7da8eb7e","Type":"ContainerDied","Data":"98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5"} Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.603980 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r59p6" podStartSLOduration=2.038075765 podStartE2EDuration="4.603956593s" podCreationTimestamp="2025-12-04 00:14:08 +0000 UTC" firstStartedPulling="2025-12-04 00:14:09.489978682 +0000 UTC m=+1985.251303133" lastFinishedPulling="2025-12-04 00:14:12.05585952 +0000 UTC m=+1987.817183961" observedRunningTime="2025-12-04 00:14:12.584175858 +0000 UTC m=+1988.345500279" watchObservedRunningTime="2025-12-04 00:14:12.603956593 +0000 UTC m=+1988.365281004" Dec 04 00:14:12 crc kubenswrapper[4764]: I1204 00:14:12.677352 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jsnrv" podStartSLOduration=3.231458211 podStartE2EDuration="5.677327094s" podCreationTimestamp="2025-12-04 00:14:07 +0000 UTC" firstStartedPulling="2025-12-04 00:14:09.487917711 +0000 UTC m=+1985.249242142" lastFinishedPulling="2025-12-04 00:14:11.933786604 +0000 UTC m=+1987.695111025" observedRunningTime="2025-12-04 00:14:12.670462246 +0000 UTC m=+1988.431786647" watchObservedRunningTime="2025-12-04 00:14:12.677327094 +0000 UTC m=+1988.438651505" Dec 04 00:14:13 crc kubenswrapper[4764]: I1204 00:14:13.570704 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72wz4" 
event={"ID":"1ccaff1b-c77d-4a49-b065-df422e743d3a","Type":"ContainerStarted","Data":"c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa"} Dec 04 00:14:13 crc kubenswrapper[4764]: I1204 00:14:13.573095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjsbx" event={"ID":"e57206c8-e2cc-4ba4-869c-addb7da8eb7e","Type":"ContainerStarted","Data":"0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69"} Dec 04 00:14:13 crc kubenswrapper[4764]: I1204 00:14:13.624492 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72wz4" podStartSLOduration=2.180208152 podStartE2EDuration="3.624468611s" podCreationTimestamp="2025-12-04 00:14:10 +0000 UTC" firstStartedPulling="2025-12-04 00:14:11.530158377 +0000 UTC m=+1987.291482798" lastFinishedPulling="2025-12-04 00:14:12.974418846 +0000 UTC m=+1988.735743257" observedRunningTime="2025-12-04 00:14:13.619235213 +0000 UTC m=+1989.380559624" watchObservedRunningTime="2025-12-04 00:14:13.624468611 +0000 UTC m=+1989.385793022" Dec 04 00:14:13 crc kubenswrapper[4764]: I1204 00:14:13.666618 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjsbx" podStartSLOduration=2.221057476 podStartE2EDuration="3.666601206s" podCreationTimestamp="2025-12-04 00:14:10 +0000 UTC" firstStartedPulling="2025-12-04 00:14:11.524313754 +0000 UTC m=+1987.285638165" lastFinishedPulling="2025-12-04 00:14:12.969857484 +0000 UTC m=+1988.731181895" observedRunningTime="2025-12-04 00:14:13.663695424 +0000 UTC m=+1989.425019835" watchObservedRunningTime="2025-12-04 00:14:13.666601206 +0000 UTC m=+1989.427925617" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.000361 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.001018 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.057439 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.574133 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.574167 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.620622 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.672287 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:18 crc kubenswrapper[4764]: I1204 00:14:18.672466 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.400081 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.400571 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.431257 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsnrv"] Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.459515 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72wz4" Dec 
04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.644045 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jsnrv" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="registry-server" containerID="cri-o://3fc72dcc7e97e3a599f80abe86f6cce96037b74fc60521b5785596656ff3d27d" gracePeriod=2 Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.710153 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.956424 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:20 crc kubenswrapper[4764]: I1204 00:14:20.956516 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:21 crc kubenswrapper[4764]: I1204 00:14:21.003129 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:21 crc kubenswrapper[4764]: I1204 00:14:21.031285 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r59p6"] Dec 04 00:14:21 crc kubenswrapper[4764]: I1204 00:14:21.031556 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r59p6" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="registry-server" containerID="cri-o://801839d807e6bc2936182602bf56894c440dc167e70950125d39cb36c05aca75" gracePeriod=2 Dec 04 00:14:21 crc kubenswrapper[4764]: I1204 00:14:21.704207 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.432357 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-72wz4"] Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.545702 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.670012 4764 generic.go:334] "Generic (PLEG): container finished" podID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerID="3fc72dcc7e97e3a599f80abe86f6cce96037b74fc60521b5785596656ff3d27d" exitCode=0 Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.670099 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerDied","Data":"3fc72dcc7e97e3a599f80abe86f6cce96037b74fc60521b5785596656ff3d27d"} Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.676126 4764 generic.go:334] "Generic (PLEG): container finished" podID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerID="801839d807e6bc2936182602bf56894c440dc167e70950125d39cb36c05aca75" exitCode=0 Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.676441 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72wz4" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="registry-server" containerID="cri-o://c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa" gracePeriod=2 Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.676836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerDied","Data":"801839d807e6bc2936182602bf56894c440dc167e70950125d39cb36c05aca75"} Dec 04 00:14:23 crc kubenswrapper[4764]: I1204 00:14:23.894545 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.002061 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-catalog-content\") pod \"8c567f8d-76a3-4f55-a61a-930b934047bb\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.002146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzxhk\" (UniqueName: \"kubernetes.io/projected/8c567f8d-76a3-4f55-a61a-930b934047bb-kube-api-access-kzxhk\") pod \"8c567f8d-76a3-4f55-a61a-930b934047bb\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.002178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-utilities\") pod \"8c567f8d-76a3-4f55-a61a-930b934047bb\" (UID: \"8c567f8d-76a3-4f55-a61a-930b934047bb\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.003600 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-utilities" (OuterVolumeSpecName: "utilities") pod "8c567f8d-76a3-4f55-a61a-930b934047bb" (UID: "8c567f8d-76a3-4f55-a61a-930b934047bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.012384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c567f8d-76a3-4f55-a61a-930b934047bb-kube-api-access-kzxhk" (OuterVolumeSpecName: "kube-api-access-kzxhk") pod "8c567f8d-76a3-4f55-a61a-930b934047bb" (UID: "8c567f8d-76a3-4f55-a61a-930b934047bb"). InnerVolumeSpecName "kube-api-access-kzxhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.053835 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.103931 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzxhk\" (UniqueName: \"kubernetes.io/projected/8c567f8d-76a3-4f55-a61a-930b934047bb-kube-api-access-kzxhk\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.103961 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.118425 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c567f8d-76a3-4f55-a61a-930b934047bb" (UID: "8c567f8d-76a3-4f55-a61a-930b934047bb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.131587 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.205254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwpvz\" (UniqueName: \"kubernetes.io/projected/4ee85068-9627-4e7f-9e3e-2365aae5fef3-kube-api-access-pwpvz\") pod \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.205355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-utilities\") pod \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.205402 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-catalog-content\") pod \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\" (UID: \"4ee85068-9627-4e7f-9e3e-2365aae5fef3\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.205599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-catalog-content\") pod \"1ccaff1b-c77d-4a49-b065-df422e743d3a\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.205837 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c567f8d-76a3-4f55-a61a-930b934047bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.208995 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-utilities" 
(OuterVolumeSpecName: "utilities") pod "4ee85068-9627-4e7f-9e3e-2365aae5fef3" (UID: "4ee85068-9627-4e7f-9e3e-2365aae5fef3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.209408 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee85068-9627-4e7f-9e3e-2365aae5fef3-kube-api-access-pwpvz" (OuterVolumeSpecName: "kube-api-access-pwpvz") pod "4ee85068-9627-4e7f-9e3e-2365aae5fef3" (UID: "4ee85068-9627-4e7f-9e3e-2365aae5fef3"). InnerVolumeSpecName "kube-api-access-pwpvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.229575 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ee85068-9627-4e7f-9e3e-2365aae5fef3" (UID: "4ee85068-9627-4e7f-9e3e-2365aae5fef3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.251599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ccaff1b-c77d-4a49-b065-df422e743d3a" (UID: "1ccaff1b-c77d-4a49-b065-df422e743d3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.306641 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh86q\" (UniqueName: \"kubernetes.io/projected/1ccaff1b-c77d-4a49-b065-df422e743d3a-kube-api-access-fh86q\") pod \"1ccaff1b-c77d-4a49-b065-df422e743d3a\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.306692 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-utilities\") pod \"1ccaff1b-c77d-4a49-b065-df422e743d3a\" (UID: \"1ccaff1b-c77d-4a49-b065-df422e743d3a\") " Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.306919 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwpvz\" (UniqueName: \"kubernetes.io/projected/4ee85068-9627-4e7f-9e3e-2365aae5fef3-kube-api-access-pwpvz\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.306931 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.306939 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ee85068-9627-4e7f-9e3e-2365aae5fef3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.306948 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.307709 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-utilities" (OuterVolumeSpecName: "utilities") pod "1ccaff1b-c77d-4a49-b065-df422e743d3a" (UID: "1ccaff1b-c77d-4a49-b065-df422e743d3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.310665 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ccaff1b-c77d-4a49-b065-df422e743d3a-kube-api-access-fh86q" (OuterVolumeSpecName: "kube-api-access-fh86q") pod "1ccaff1b-c77d-4a49-b065-df422e743d3a" (UID: "1ccaff1b-c77d-4a49-b065-df422e743d3a"). InnerVolumeSpecName "kube-api-access-fh86q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.415269 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh86q\" (UniqueName: \"kubernetes.io/projected/1ccaff1b-c77d-4a49-b065-df422e743d3a-kube-api-access-fh86q\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.415519 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ccaff1b-c77d-4a49-b065-df422e743d3a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.684799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"e2982f62465c89a55ebdeee589aec4f46771fcfa41a60b199a2aa49d2d5e7eb1"} Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.687270 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jsnrv" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.687272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jsnrv" event={"ID":"4ee85068-9627-4e7f-9e3e-2365aae5fef3","Type":"ContainerDied","Data":"573a0480edf65ea98d61ba3a4beed960898917484642cf6c8cee4cb95be5d59a"} Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.687628 4764 scope.go:117] "RemoveContainer" containerID="3fc72dcc7e97e3a599f80abe86f6cce96037b74fc60521b5785596656ff3d27d" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.690941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r59p6" event={"ID":"8c567f8d-76a3-4f55-a61a-930b934047bb","Type":"ContainerDied","Data":"a732cc1367e642c9f255a5eab37dbe8f4fd107b31d370610dbcb240417ee96b5"} Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.690968 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r59p6" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.693553 4764 generic.go:334] "Generic (PLEG): container finished" podID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerID="c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa" exitCode=0 Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.693596 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72wz4" event={"ID":"1ccaff1b-c77d-4a49-b065-df422e743d3a","Type":"ContainerDied","Data":"c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa"} Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.693673 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72wz4" event={"ID":"1ccaff1b-c77d-4a49-b065-df422e743d3a","Type":"ContainerDied","Data":"8b00d33406f818be7da0e6696f81dbef9659a4786f1c53e6ea0f403610e114c0"} Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.693777 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72wz4" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.710843 4764 scope.go:117] "RemoveContainer" containerID="51300f29424f52bb36ea11c98af112380cc6f227ae42d1e56cbece7a5b7da0fd" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.733059 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r59p6"] Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.736293 4764 scope.go:117] "RemoveContainer" containerID="25ab066ca02d1d23ca25264ed2ff74101e869b7fcf524345212e985a193e43dd" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.748232 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r59p6"] Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.758508 4764 scope.go:117] "RemoveContainer" containerID="801839d807e6bc2936182602bf56894c440dc167e70950125d39cb36c05aca75" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.758726 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsnrv"] Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.766143 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jsnrv"] Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.773048 4764 scope.go:117] "RemoveContainer" containerID="ca5a4d8bd71ad4d56b2ea9e85fd453de3d8fd4e7532d118c7842b8870c91afe5" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.773656 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72wz4"] Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.779559 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72wz4"] Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.789048 4764 scope.go:117] "RemoveContainer" 
containerID="50916ea74bb1ab58c9f82b4e154204dfdc7e16545aa4617a506de7120f4e19a9" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.817896 4764 scope.go:117] "RemoveContainer" containerID="c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.836594 4764 scope.go:117] "RemoveContainer" containerID="05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.854196 4764 scope.go:117] "RemoveContainer" containerID="2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.869784 4764 scope.go:117] "RemoveContainer" containerID="c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa" Dec 04 00:14:24 crc kubenswrapper[4764]: E1204 00:14:24.870336 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa\": container with ID starting with c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa not found: ID does not exist" containerID="c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.870370 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa"} err="failed to get container status \"c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa\": rpc error: code = NotFound desc = could not find container \"c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa\": container with ID starting with c4cc1ccdbdfa7528c8bc01a2fe9a473087598688e4045aa54a95a943eaaf2eaa not found: ID does not exist" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.870391 4764 scope.go:117] "RemoveContainer" 
containerID="05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3" Dec 04 00:14:24 crc kubenswrapper[4764]: E1204 00:14:24.870979 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3\": container with ID starting with 05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3 not found: ID does not exist" containerID="05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.871042 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3"} err="failed to get container status \"05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3\": rpc error: code = NotFound desc = could not find container \"05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3\": container with ID starting with 05321979ef0d4b6ede3ddeff323698e5ae32da86db6aaf6754892fd83c6092b3 not found: ID does not exist" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.871079 4764 scope.go:117] "RemoveContainer" containerID="2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276" Dec 04 00:14:24 crc kubenswrapper[4764]: E1204 00:14:24.871456 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276\": container with ID starting with 2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276 not found: ID does not exist" containerID="2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276" Dec 04 00:14:24 crc kubenswrapper[4764]: I1204 00:14:24.871493 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276"} err="failed to get container status \"2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276\": rpc error: code = NotFound desc = could not find container \"2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276\": container with ID starting with 2ad05a83b11519a4d0a57c5b9ee1dca4b4c701c268062c2eb55da67fc7e71276 not found: ID does not exist" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.237216 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjsbx"] Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.237584 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tjsbx" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="registry-server" containerID="cri-o://0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69" gracePeriod=2 Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.614492 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.705195 4764 generic.go:334] "Generic (PLEG): container finished" podID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerID="0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69" exitCode=0 Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.705232 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tjsbx" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.705253 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjsbx" event={"ID":"e57206c8-e2cc-4ba4-869c-addb7da8eb7e","Type":"ContainerDied","Data":"0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69"} Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.706664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjsbx" event={"ID":"e57206c8-e2cc-4ba4-869c-addb7da8eb7e","Type":"ContainerDied","Data":"9d798480f482884bf22ed180e50b189b5145603d3b6a52942f4cfc5952d0d496"} Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.706687 4764 scope.go:117] "RemoveContainer" containerID="0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.740798 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-catalog-content\") pod \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.740865 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-utilities\") pod \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.740889 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdsx\" (UniqueName: \"kubernetes.io/projected/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-kube-api-access-spdsx\") pod \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\" (UID: \"e57206c8-e2cc-4ba4-869c-addb7da8eb7e\") " Dec 04 00:14:25 crc 
kubenswrapper[4764]: I1204 00:14:25.746567 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-utilities" (OuterVolumeSpecName: "utilities") pod "e57206c8-e2cc-4ba4-869c-addb7da8eb7e" (UID: "e57206c8-e2cc-4ba4-869c-addb7da8eb7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.751551 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-kube-api-access-spdsx" (OuterVolumeSpecName: "kube-api-access-spdsx") pod "e57206c8-e2cc-4ba4-869c-addb7da8eb7e" (UID: "e57206c8-e2cc-4ba4-869c-addb7da8eb7e"). InnerVolumeSpecName "kube-api-access-spdsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.772551 4764 scope.go:117] "RemoveContainer" containerID="98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.842685 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.842739 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdsx\" (UniqueName: \"kubernetes.io/projected/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-kube-api-access-spdsx\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.857028 4764 scope.go:117] "RemoveContainer" containerID="9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.893176 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "e57206c8-e2cc-4ba4-869c-addb7da8eb7e" (UID: "e57206c8-e2cc-4ba4-869c-addb7da8eb7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.905414 4764 scope.go:117] "RemoveContainer" containerID="0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69" Dec 04 00:14:25 crc kubenswrapper[4764]: E1204 00:14:25.906173 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69\": container with ID starting with 0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69 not found: ID does not exist" containerID="0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.906219 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69"} err="failed to get container status \"0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69\": rpc error: code = NotFound desc = could not find container \"0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69\": container with ID starting with 0706bbc6f06ca5e096bb092c2b0dfa3ae7eeb77b88565c504f868307f4ac4c69 not found: ID does not exist" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.906246 4764 scope.go:117] "RemoveContainer" containerID="98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5" Dec 04 00:14:25 crc kubenswrapper[4764]: E1204 00:14:25.907800 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5\": container with ID starting with 98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5 
not found: ID does not exist" containerID="98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.907831 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5"} err="failed to get container status \"98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5\": rpc error: code = NotFound desc = could not find container \"98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5\": container with ID starting with 98615bf81357722679abcdcd31cd12b371981d550bcf89d939efa615ddc989a5 not found: ID does not exist" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.907849 4764 scope.go:117] "RemoveContainer" containerID="9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b" Dec 04 00:14:25 crc kubenswrapper[4764]: E1204 00:14:25.909986 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b\": container with ID starting with 9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b not found: ID does not exist" containerID="9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 00:14:25.910011 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b"} err="failed to get container status \"9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b\": rpc error: code = NotFound desc = could not find container \"9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b\": container with ID starting with 9516664947d4e097382812d0f1f011f491f7f553fdc80e257a25a57034508a7b not found: ID does not exist" Dec 04 00:14:25 crc kubenswrapper[4764]: I1204 
00:14:25.943884 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57206c8-e2cc-4ba4-869c-addb7da8eb7e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:14:26 crc kubenswrapper[4764]: I1204 00:14:26.035534 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjsbx"] Dec 04 00:14:26 crc kubenswrapper[4764]: I1204 00:14:26.040280 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tjsbx"] Dec 04 00:14:26 crc kubenswrapper[4764]: I1204 00:14:26.555688 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" path="/var/lib/kubelet/pods/1ccaff1b-c77d-4a49-b065-df422e743d3a/volumes" Dec 04 00:14:26 crc kubenswrapper[4764]: I1204 00:14:26.556581 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" path="/var/lib/kubelet/pods/4ee85068-9627-4e7f-9e3e-2365aae5fef3/volumes" Dec 04 00:14:26 crc kubenswrapper[4764]: I1204 00:14:26.557346 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" path="/var/lib/kubelet/pods/8c567f8d-76a3-4f55-a61a-930b934047bb/volumes" Dec 04 00:14:26 crc kubenswrapper[4764]: I1204 00:14:26.558857 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" path="/var/lib/kubelet/pods/e57206c8-e2cc-4ba4-869c-addb7da8eb7e/volumes" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.156955 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn"] Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.157913 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="extract-utilities" Dec 04 00:15:00 crc 
kubenswrapper[4764]: I1204 00:15:00.157930 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="extract-utilities" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.157950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="extract-utilities" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.157958 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="extract-utilities" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.157970 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.157979 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.157992 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158001 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158013 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158021 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158035 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="extract-utilities" Dec 04 00:15:00 crc 
kubenswrapper[4764]: I1204 00:15:00.158043 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="extract-utilities" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158060 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158067 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158081 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158089 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="extract-content" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158105 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158113 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158126 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="extract-utilities" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158134 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="extract-utilities" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158145 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="registry-server" Dec 04 00:15:00 crc 
kubenswrapper[4764]: I1204 00:15:00.158153 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: E1204 00:15:00.158172 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158180 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158341 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ccaff1b-c77d-4a49-b065-df422e743d3a" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158356 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57206c8-e2cc-4ba4-869c-addb7da8eb7e" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158367 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c567f8d-76a3-4f55-a61a-930b934047bb" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.158383 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee85068-9627-4e7f-9e3e-2365aae5fef3" containerName="registry-server" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.159038 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.163801 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.163805 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.166278 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn"] Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.212187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20927d90-3909-4581-879a-0aaf6d4997c7-config-volume\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.212630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20927d90-3909-4581-879a-0aaf6d4997c7-secret-volume\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.212733 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skfrz\" (UniqueName: \"kubernetes.io/projected/20927d90-3909-4581-879a-0aaf6d4997c7-kube-api-access-skfrz\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.313658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20927d90-3909-4581-879a-0aaf6d4997c7-config-volume\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.313738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20927d90-3909-4581-879a-0aaf6d4997c7-secret-volume\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.313807 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skfrz\" (UniqueName: \"kubernetes.io/projected/20927d90-3909-4581-879a-0aaf6d4997c7-kube-api-access-skfrz\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.314840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20927d90-3909-4581-879a-0aaf6d4997c7-config-volume\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.332579 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/20927d90-3909-4581-879a-0aaf6d4997c7-secret-volume\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.332927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skfrz\" (UniqueName: \"kubernetes.io/projected/20927d90-3909-4581-879a-0aaf6d4997c7-kube-api-access-skfrz\") pod \"collect-profiles-29413455-7slcn\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.497018 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:00 crc kubenswrapper[4764]: I1204 00:15:00.988620 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn"] Dec 04 00:15:00 crc kubenswrapper[4764]: W1204 00:15:00.998088 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20927d90_3909_4581_879a_0aaf6d4997c7.slice/crio-b5b3fe45934538c203c06d28d5bcb50101f1b3ac007ce073626d177a294ba6e0 WatchSource:0}: Error finding container b5b3fe45934538c203c06d28d5bcb50101f1b3ac007ce073626d177a294ba6e0: Status 404 returned error can't find the container with id b5b3fe45934538c203c06d28d5bcb50101f1b3ac007ce073626d177a294ba6e0 Dec 04 00:15:01 crc kubenswrapper[4764]: I1204 00:15:01.047090 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" event={"ID":"20927d90-3909-4581-879a-0aaf6d4997c7","Type":"ContainerStarted","Data":"b5b3fe45934538c203c06d28d5bcb50101f1b3ac007ce073626d177a294ba6e0"} Dec 04 00:15:02 crc 
kubenswrapper[4764]: I1204 00:15:02.058961 4764 generic.go:334] "Generic (PLEG): container finished" podID="20927d90-3909-4581-879a-0aaf6d4997c7" containerID="d9a170544faac2c83798b4c713fc3edde3f3b3efa9b309a497a722a6d2a4d8d2" exitCode=0 Dec 04 00:15:02 crc kubenswrapper[4764]: I1204 00:15:02.059097 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" event={"ID":"20927d90-3909-4581-879a-0aaf6d4997c7","Type":"ContainerDied","Data":"d9a170544faac2c83798b4c713fc3edde3f3b3efa9b309a497a722a6d2a4d8d2"} Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.389450 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.465051 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20927d90-3909-4581-879a-0aaf6d4997c7-config-volume\") pod \"20927d90-3909-4581-879a-0aaf6d4997c7\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.465141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skfrz\" (UniqueName: \"kubernetes.io/projected/20927d90-3909-4581-879a-0aaf6d4997c7-kube-api-access-skfrz\") pod \"20927d90-3909-4581-879a-0aaf6d4997c7\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.465292 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20927d90-3909-4581-879a-0aaf6d4997c7-secret-volume\") pod \"20927d90-3909-4581-879a-0aaf6d4997c7\" (UID: \"20927d90-3909-4581-879a-0aaf6d4997c7\") " Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.465974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/20927d90-3909-4581-879a-0aaf6d4997c7-config-volume" (OuterVolumeSpecName: "config-volume") pod "20927d90-3909-4581-879a-0aaf6d4997c7" (UID: "20927d90-3909-4581-879a-0aaf6d4997c7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.470133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20927d90-3909-4581-879a-0aaf6d4997c7-kube-api-access-skfrz" (OuterVolumeSpecName: "kube-api-access-skfrz") pod "20927d90-3909-4581-879a-0aaf6d4997c7" (UID: "20927d90-3909-4581-879a-0aaf6d4997c7"). InnerVolumeSpecName "kube-api-access-skfrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.471174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20927d90-3909-4581-879a-0aaf6d4997c7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20927d90-3909-4581-879a-0aaf6d4997c7" (UID: "20927d90-3909-4581-879a-0aaf6d4997c7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.567667 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20927d90-3909-4581-879a-0aaf6d4997c7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.567703 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skfrz\" (UniqueName: \"kubernetes.io/projected/20927d90-3909-4581-879a-0aaf6d4997c7-kube-api-access-skfrz\") on node \"crc\" DevicePath \"\"" Dec 04 00:15:03 crc kubenswrapper[4764]: I1204 00:15:03.567748 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20927d90-3909-4581-879a-0aaf6d4997c7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 00:15:04 crc kubenswrapper[4764]: I1204 00:15:04.079428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" event={"ID":"20927d90-3909-4581-879a-0aaf6d4997c7","Type":"ContainerDied","Data":"b5b3fe45934538c203c06d28d5bcb50101f1b3ac007ce073626d177a294ba6e0"} Dec 04 00:15:04 crc kubenswrapper[4764]: I1204 00:15:04.079467 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b3fe45934538c203c06d28d5bcb50101f1b3ac007ce073626d177a294ba6e0" Dec 04 00:15:04 crc kubenswrapper[4764]: I1204 00:15:04.079502 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn" Dec 04 00:15:04 crc kubenswrapper[4764]: I1204 00:15:04.479599 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"] Dec 04 00:15:04 crc kubenswrapper[4764]: I1204 00:15:04.488495 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413410-zrhh8"] Dec 04 00:15:04 crc kubenswrapper[4764]: I1204 00:15:04.565664 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="220eeac0-5b43-462d-89cc-5182a6b1f686" path="/var/lib/kubelet/pods/220eeac0-5b43-462d-89cc-5182a6b1f686/volumes" Dec 04 00:15:08 crc kubenswrapper[4764]: I1204 00:15:08.812315 4764 scope.go:117] "RemoveContainer" containerID="83dee69a305d81012986a61e20e4a0e4baaf0715ff083fe7493c972bf8c9c231" Dec 04 00:16:50 crc kubenswrapper[4764]: I1204 00:16:50.869259 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:16:50 crc kubenswrapper[4764]: I1204 00:16:50.870092 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:17:20 crc kubenswrapper[4764]: I1204 00:17:20.869145 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 04 00:17:20 crc kubenswrapper[4764]: I1204 00:17:20.869752 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:17:50 crc kubenswrapper[4764]: I1204 00:17:50.869244 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:17:50 crc kubenswrapper[4764]: I1204 00:17:50.870084 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:17:50 crc kubenswrapper[4764]: I1204 00:17:50.870159 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:17:50 crc kubenswrapper[4764]: I1204 00:17:50.871212 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2982f62465c89a55ebdeee589aec4f46771fcfa41a60b199a2aa49d2d5e7eb1"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:17:50 crc kubenswrapper[4764]: I1204 00:17:50.871324 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://e2982f62465c89a55ebdeee589aec4f46771fcfa41a60b199a2aa49d2d5e7eb1" gracePeriod=600 Dec 04 00:17:51 crc kubenswrapper[4764]: I1204 00:17:51.559679 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="e2982f62465c89a55ebdeee589aec4f46771fcfa41a60b199a2aa49d2d5e7eb1" exitCode=0 Dec 04 00:17:51 crc kubenswrapper[4764]: I1204 00:17:51.560053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"e2982f62465c89a55ebdeee589aec4f46771fcfa41a60b199a2aa49d2d5e7eb1"} Dec 04 00:17:51 crc kubenswrapper[4764]: I1204 00:17:51.560086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"} Dec 04 00:17:51 crc kubenswrapper[4764]: I1204 00:17:51.560107 4764 scope.go:117] "RemoveContainer" containerID="6843fdd5cf9df3ca843cba128dcbaf3003aca4af6314f788bfbf009e42e3dfb1" Dec 04 00:20:20 crc kubenswrapper[4764]: I1204 00:20:20.870104 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:20:20 crc kubenswrapper[4764]: I1204 00:20:20.870838 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:20:50 crc kubenswrapper[4764]: I1204 00:20:50.868832 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:20:50 crc kubenswrapper[4764]: I1204 00:20:50.869411 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:21:20 crc kubenswrapper[4764]: I1204 00:21:20.868764 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:21:20 crc kubenswrapper[4764]: I1204 00:21:20.869847 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:21:20 crc kubenswrapper[4764]: I1204 00:21:20.869944 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:21:20 crc kubenswrapper[4764]: I1204 00:21:20.870630 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:21:20 crc kubenswrapper[4764]: I1204 00:21:20.870738 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" gracePeriod=600 Dec 04 00:21:21 crc kubenswrapper[4764]: E1204 00:21:21.008920 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:21:21 crc kubenswrapper[4764]: I1204 00:21:21.427542 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" exitCode=0 Dec 04 00:21:21 crc kubenswrapper[4764]: I1204 00:21:21.427595 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"} Dec 04 00:21:21 crc kubenswrapper[4764]: I1204 00:21:21.427633 4764 scope.go:117] "RemoveContainer" containerID="e2982f62465c89a55ebdeee589aec4f46771fcfa41a60b199a2aa49d2d5e7eb1" Dec 04 00:21:21 crc kubenswrapper[4764]: I1204 00:21:21.429425 4764 
scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:21:21 crc kubenswrapper[4764]: E1204 00:21:21.430181 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:21:35 crc kubenswrapper[4764]: I1204 00:21:35.549459 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:21:35 crc kubenswrapper[4764]: E1204 00:21:35.552515 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:21:49 crc kubenswrapper[4764]: I1204 00:21:49.546601 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:21:49 crc kubenswrapper[4764]: E1204 00:21:49.548292 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:21:52 crc kubenswrapper[4764]: I1204 
00:21:52.929056 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-n7jnx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 00:21:52 crc kubenswrapper[4764]: I1204 00:21:52.930365 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" podUID="82bbef02-a9d4-42e3-a874-f702e232be80" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 00:21:53 crc kubenswrapper[4764]: I1204 00:21:53.097167 4764 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-n7jnx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 00:21:53 crc kubenswrapper[4764]: I1204 00:21:53.097241 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-n7jnx" podUID="82bbef02-a9d4-42e3-a874-f702e232be80" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 00:22:03 crc kubenswrapper[4764]: I1204 00:22:03.546583 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:22:03 crc kubenswrapper[4764]: E1204 00:22:03.547850 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:22:17 crc kubenswrapper[4764]: I1204 00:22:17.546147 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:22:17 crc kubenswrapper[4764]: E1204 00:22:17.547197 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:22:31 crc kubenswrapper[4764]: I1204 00:22:31.545623 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:22:31 crc kubenswrapper[4764]: E1204 00:22:31.546562 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:22:46 crc kubenswrapper[4764]: I1204 00:22:46.545520 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:22:46 crc kubenswrapper[4764]: E1204 00:22:46.546388 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:23:01 crc kubenswrapper[4764]: I1204 00:23:01.546391 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:23:01 crc kubenswrapper[4764]: E1204 00:23:01.547131 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:23:14 crc kubenswrapper[4764]: I1204 00:23:14.554405 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:23:14 crc kubenswrapper[4764]: E1204 00:23:14.555607 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:23:28 crc kubenswrapper[4764]: I1204 00:23:28.546998 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:23:28 crc kubenswrapper[4764]: E1204 
00:23:28.547963 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:23:40 crc kubenswrapper[4764]: I1204 00:23:40.545191 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:23:40 crc kubenswrapper[4764]: E1204 00:23:40.545950 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:23:53 crc kubenswrapper[4764]: I1204 00:23:53.546143 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:23:53 crc kubenswrapper[4764]: E1204 00:23:53.546878 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:24:06 crc kubenswrapper[4764]: I1204 00:24:06.546463 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:24:06 crc 
kubenswrapper[4764]: E1204 00:24:06.547936 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:24:14 crc kubenswrapper[4764]: I1204 00:24:14.976113 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4wgf"] Dec 04 00:24:14 crc kubenswrapper[4764]: E1204 00:24:14.977030 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20927d90-3909-4581-879a-0aaf6d4997c7" containerName="collect-profiles" Dec 04 00:24:14 crc kubenswrapper[4764]: I1204 00:24:14.977047 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="20927d90-3909-4581-879a-0aaf6d4997c7" containerName="collect-profiles" Dec 04 00:24:14 crc kubenswrapper[4764]: I1204 00:24:14.977216 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="20927d90-3909-4581-879a-0aaf6d4997c7" containerName="collect-profiles" Dec 04 00:24:14 crc kubenswrapper[4764]: I1204 00:24:14.978391 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.002793 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4wgf"] Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.012008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pc7\" (UniqueName: \"kubernetes.io/projected/44bfa28c-2c81-4c2b-8585-d970a6505ac2-kube-api-access-v2pc7\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.012048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-utilities\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.012072 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-catalog-content\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.113437 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pc7\" (UniqueName: \"kubernetes.io/projected/44bfa28c-2c81-4c2b-8585-d970a6505ac2-kube-api-access-v2pc7\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.113495 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-utilities\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.113523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-catalog-content\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.113978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-utilities\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.114031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-catalog-content\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.141930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pc7\" (UniqueName: \"kubernetes.io/projected/44bfa28c-2c81-4c2b-8585-d970a6505ac2-kube-api-access-v2pc7\") pod \"redhat-operators-n4wgf\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") " pod="openshift-marketplace/redhat-operators-n4wgf" Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.299222 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:15 crc kubenswrapper[4764]: I1204 00:24:15.790309 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4wgf"]
Dec 04 00:24:16 crc kubenswrapper[4764]: I1204 00:24:16.208899 4764 generic.go:334] "Generic (PLEG): container finished" podID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerID="7cd01baecbb12bdbbbe7da2fce8257cc77d5460f2d50c99f7aa6b550ac101132" exitCode=0
Dec 04 00:24:16 crc kubenswrapper[4764]: I1204 00:24:16.208941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerDied","Data":"7cd01baecbb12bdbbbe7da2fce8257cc77d5460f2d50c99f7aa6b550ac101132"}
Dec 04 00:24:16 crc kubenswrapper[4764]: I1204 00:24:16.208966 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerStarted","Data":"09a4e034ca3ac58fd2424a90d87698ab91d329bde46ad0c8d00b0ac617d5507a"}
Dec 04 00:24:16 crc kubenswrapper[4764]: I1204 00:24:16.210687 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 00:24:17 crc kubenswrapper[4764]: I1204 00:24:17.218581 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerStarted","Data":"7f9012b86965e2174d99fd707814ccddd2f7fd0f3cf098bcc66c788dc32b8a8b"}
Dec 04 00:24:18 crc kubenswrapper[4764]: I1204 00:24:18.228706 4764 generic.go:334] "Generic (PLEG): container finished" podID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerID="7f9012b86965e2174d99fd707814ccddd2f7fd0f3cf098bcc66c788dc32b8a8b" exitCode=0
Dec 04 00:24:18 crc kubenswrapper[4764]: I1204 00:24:18.228799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerDied","Data":"7f9012b86965e2174d99fd707814ccddd2f7fd0f3cf098bcc66c788dc32b8a8b"}
Dec 04 00:24:19 crc kubenswrapper[4764]: I1204 00:24:19.239062 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerStarted","Data":"966d8cfbd442d9a95501d8c6fb2195fd1c59ed5a74361a83744c14e81c77a7d1"}
Dec 04 00:24:19 crc kubenswrapper[4764]: I1204 00:24:19.265379 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4wgf" podStartSLOduration=2.625652717 podStartE2EDuration="5.265359676s" podCreationTimestamp="2025-12-04 00:24:14 +0000 UTC" firstStartedPulling="2025-12-04 00:24:16.210390133 +0000 UTC m=+2591.971714544" lastFinishedPulling="2025-12-04 00:24:18.850097082 +0000 UTC m=+2594.611421503" observedRunningTime="2025-12-04 00:24:19.257369979 +0000 UTC m=+2595.018694400" watchObservedRunningTime="2025-12-04 00:24:19.265359676 +0000 UTC m=+2595.026684097"
Dec 04 00:24:20 crc kubenswrapper[4764]: I1204 00:24:20.547480 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"
Dec 04 00:24:20 crc kubenswrapper[4764]: E1204 00:24:20.547786 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:24:25 crc kubenswrapper[4764]: I1204 00:24:25.299546 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:25 crc kubenswrapper[4764]: I1204 00:24:25.300193 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:25 crc kubenswrapper[4764]: I1204 00:24:25.347402 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:26 crc kubenswrapper[4764]: I1204 00:24:26.338464 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:26 crc kubenswrapper[4764]: I1204 00:24:26.399276 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4wgf"]
Dec 04 00:24:27 crc kubenswrapper[4764]: I1204 00:24:27.988199 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hnhw7"]
Dec 04 00:24:27 crc kubenswrapper[4764]: I1204 00:24:27.990035 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.017803 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnhw7"]
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.118644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv8w7\" (UniqueName: \"kubernetes.io/projected/827a4e00-b9d9-461c-bb37-0676bfa9340e-kube-api-access-kv8w7\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.118756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-utilities\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.118813 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-catalog-content\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.220322 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-utilities\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.220398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-catalog-content\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.220446 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv8w7\" (UniqueName: \"kubernetes.io/projected/827a4e00-b9d9-461c-bb37-0676bfa9340e-kube-api-access-kv8w7\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.221531 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-catalog-content\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.221570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-utilities\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.239373 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv8w7\" (UniqueName: \"kubernetes.io/projected/827a4e00-b9d9-461c-bb37-0676bfa9340e-kube-api-access-kv8w7\") pod \"community-operators-hnhw7\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") " pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.303280 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4wgf" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="registry-server" containerID="cri-o://966d8cfbd442d9a95501d8c6fb2195fd1c59ed5a74361a83744c14e81c77a7d1" gracePeriod=2
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.314789 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:28 crc kubenswrapper[4764]: I1204 00:24:28.867106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hnhw7"]
Dec 04 00:24:29 crc kubenswrapper[4764]: I1204 00:24:29.311578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerStarted","Data":"c5f2107e3d2906457eb11c05823bd11e4fc77bbeba564ac537b4e41b2c2f9bbb"}
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.319985 4764 generic.go:334] "Generic (PLEG): container finished" podID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerID="89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae" exitCode=0
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.320042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerDied","Data":"89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae"}
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.324333 4764 generic.go:334] "Generic (PLEG): container finished" podID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerID="966d8cfbd442d9a95501d8c6fb2195fd1c59ed5a74361a83744c14e81c77a7d1" exitCode=0
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.324371 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerDied","Data":"966d8cfbd442d9a95501d8c6fb2195fd1c59ed5a74361a83744c14e81c77a7d1"}
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.536259 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.661989 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-catalog-content\") pod \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") "
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.662048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-utilities\") pod \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") "
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.662161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2pc7\" (UniqueName: \"kubernetes.io/projected/44bfa28c-2c81-4c2b-8585-d970a6505ac2-kube-api-access-v2pc7\") pod \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\" (UID: \"44bfa28c-2c81-4c2b-8585-d970a6505ac2\") "
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.663727 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-utilities" (OuterVolumeSpecName: "utilities") pod "44bfa28c-2c81-4c2b-8585-d970a6505ac2" (UID: "44bfa28c-2c81-4c2b-8585-d970a6505ac2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.670250 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44bfa28c-2c81-4c2b-8585-d970a6505ac2-kube-api-access-v2pc7" (OuterVolumeSpecName: "kube-api-access-v2pc7") pod "44bfa28c-2c81-4c2b-8585-d970a6505ac2" (UID: "44bfa28c-2c81-4c2b-8585-d970a6505ac2"). InnerVolumeSpecName "kube-api-access-v2pc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.764361 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2pc7\" (UniqueName: \"kubernetes.io/projected/44bfa28c-2c81-4c2b-8585-d970a6505ac2-kube-api-access-v2pc7\") on node \"crc\" DevicePath \"\""
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.764400 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.768867 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44bfa28c-2c81-4c2b-8585-d970a6505ac2" (UID: "44bfa28c-2c81-4c2b-8585-d970a6505ac2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:24:30 crc kubenswrapper[4764]: I1204 00:24:30.865967 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44bfa28c-2c81-4c2b-8585-d970a6505ac2-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.338926 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerStarted","Data":"1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e"}
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.343945 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4wgf" event={"ID":"44bfa28c-2c81-4c2b-8585-d970a6505ac2","Type":"ContainerDied","Data":"09a4e034ca3ac58fd2424a90d87698ab91d329bde46ad0c8d00b0ac617d5507a"}
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.344008 4764 scope.go:117] "RemoveContainer" containerID="966d8cfbd442d9a95501d8c6fb2195fd1c59ed5a74361a83744c14e81c77a7d1"
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.344026 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4wgf"
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.393179 4764 scope.go:117] "RemoveContainer" containerID="7f9012b86965e2174d99fd707814ccddd2f7fd0f3cf098bcc66c788dc32b8a8b"
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.411503 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4wgf"]
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.420223 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4wgf"]
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.434378 4764 scope.go:117] "RemoveContainer" containerID="7cd01baecbb12bdbbbe7da2fce8257cc77d5460f2d50c99f7aa6b550ac101132"
Dec 04 00:24:31 crc kubenswrapper[4764]: I1204 00:24:31.546155 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"
Dec 04 00:24:31 crc kubenswrapper[4764]: E1204 00:24:31.546622 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:24:32 crc kubenswrapper[4764]: I1204 00:24:32.355469 4764 generic.go:334] "Generic (PLEG): container finished" podID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerID="1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e" exitCode=0
Dec 04 00:24:32 crc kubenswrapper[4764]: I1204 00:24:32.355565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerDied","Data":"1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e"}
Dec 04 00:24:32 crc kubenswrapper[4764]: I1204 00:24:32.356010 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerStarted","Data":"fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc"}
Dec 04 00:24:32 crc kubenswrapper[4764]: I1204 00:24:32.384707 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hnhw7" podStartSLOduration=3.929209138 podStartE2EDuration="5.384689486s" podCreationTimestamp="2025-12-04 00:24:27 +0000 UTC" firstStartedPulling="2025-12-04 00:24:30.322745936 +0000 UTC m=+2606.084070347" lastFinishedPulling="2025-12-04 00:24:31.778226264 +0000 UTC m=+2607.539550695" observedRunningTime="2025-12-04 00:24:32.379164289 +0000 UTC m=+2608.140488740" watchObservedRunningTime="2025-12-04 00:24:32.384689486 +0000 UTC m=+2608.146013897"
Dec 04 00:24:32 crc kubenswrapper[4764]: I1204 00:24:32.556900 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" path="/var/lib/kubelet/pods/44bfa28c-2c81-4c2b-8585-d970a6505ac2/volumes"
Dec 04 00:24:38 crc kubenswrapper[4764]: I1204 00:24:38.316017 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:38 crc kubenswrapper[4764]: I1204 00:24:38.316892 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:38 crc kubenswrapper[4764]: I1204 00:24:38.385954 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:38 crc kubenswrapper[4764]: I1204 00:24:38.448394 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:38 crc kubenswrapper[4764]: I1204 00:24:38.627913 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnhw7"]
Dec 04 00:24:40 crc kubenswrapper[4764]: I1204 00:24:40.420847 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hnhw7" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="registry-server" containerID="cri-o://fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc" gracePeriod=2
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.386356 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.430521 4764 generic.go:334] "Generic (PLEG): container finished" podID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerID="fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc" exitCode=0
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.430586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerDied","Data":"fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc"}
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.430806 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hnhw7" event={"ID":"827a4e00-b9d9-461c-bb37-0676bfa9340e","Type":"ContainerDied","Data":"c5f2107e3d2906457eb11c05823bd11e4fc77bbeba564ac537b4e41b2c2f9bbb"}
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.430829 4764 scope.go:117] "RemoveContainer" containerID="fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.430635 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hnhw7"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.449997 4764 scope.go:117] "RemoveContainer" containerID="1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.468614 4764 scope.go:117] "RemoveContainer" containerID="89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.491884 4764 scope.go:117] "RemoveContainer" containerID="fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc"
Dec 04 00:24:41 crc kubenswrapper[4764]: E1204 00:24:41.492452 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc\": container with ID starting with fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc not found: ID does not exist" containerID="fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.492493 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc"} err="failed to get container status \"fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc\": rpc error: code = NotFound desc = could not find container \"fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc\": container with ID starting with fd717d00aaeed94f22a3c6e18c413a67101468874be116aeb9fa4360bb7789bc not found: ID does not exist"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.492517 4764 scope.go:117] "RemoveContainer" containerID="1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e"
Dec 04 00:24:41 crc kubenswrapper[4764]: E1204 00:24:41.492863 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e\": container with ID starting with 1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e not found: ID does not exist" containerID="1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.492914 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e"} err="failed to get container status \"1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e\": rpc error: code = NotFound desc = could not find container \"1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e\": container with ID starting with 1a6a1a091755026b3f6d43f4560e26412489fd106676b54f150a67f2e7dc517e not found: ID does not exist"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.492944 4764 scope.go:117] "RemoveContainer" containerID="89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae"
Dec 04 00:24:41 crc kubenswrapper[4764]: E1204 00:24:41.493214 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae\": container with ID starting with 89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae not found: ID does not exist" containerID="89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.493253 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae"} err="failed to get container status \"89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae\": rpc error: code = NotFound desc = could not find container \"89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae\": container with ID starting with 89a7be95cfe866c529bc2581e12d2478463152d315d409ea71f57dea95bae0ae not found: ID does not exist"
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.524954 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-utilities\") pod \"827a4e00-b9d9-461c-bb37-0676bfa9340e\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") "
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.525041 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv8w7\" (UniqueName: \"kubernetes.io/projected/827a4e00-b9d9-461c-bb37-0676bfa9340e-kube-api-access-kv8w7\") pod \"827a4e00-b9d9-461c-bb37-0676bfa9340e\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") "
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.525078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-catalog-content\") pod \"827a4e00-b9d9-461c-bb37-0676bfa9340e\" (UID: \"827a4e00-b9d9-461c-bb37-0676bfa9340e\") "
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.526088 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-utilities" (OuterVolumeSpecName: "utilities") pod "827a4e00-b9d9-461c-bb37-0676bfa9340e" (UID: "827a4e00-b9d9-461c-bb37-0676bfa9340e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.529973 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/827a4e00-b9d9-461c-bb37-0676bfa9340e-kube-api-access-kv8w7" (OuterVolumeSpecName: "kube-api-access-kv8w7") pod "827a4e00-b9d9-461c-bb37-0676bfa9340e" (UID: "827a4e00-b9d9-461c-bb37-0676bfa9340e"). InnerVolumeSpecName "kube-api-access-kv8w7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.579173 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "827a4e00-b9d9-461c-bb37-0676bfa9340e" (UID: "827a4e00-b9d9-461c-bb37-0676bfa9340e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.626992 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.627029 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/827a4e00-b9d9-461c-bb37-0676bfa9340e-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.627042 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv8w7\" (UniqueName: \"kubernetes.io/projected/827a4e00-b9d9-461c-bb37-0676bfa9340e-kube-api-access-kv8w7\") on node \"crc\" DevicePath \"\""
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.769013 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hnhw7"]
Dec 04 00:24:41 crc kubenswrapper[4764]: I1204 00:24:41.773779 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hnhw7"]
Dec 04 00:24:42 crc kubenswrapper[4764]: I1204 00:24:42.563238 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" path="/var/lib/kubelet/pods/827a4e00-b9d9-461c-bb37-0676bfa9340e/volumes"
Dec 04 00:24:44 crc kubenswrapper[4764]: I1204 00:24:44.557595 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"
Dec 04 00:24:44 crc kubenswrapper[4764]: E1204 00:24:44.560574 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:24:57 crc kubenswrapper[4764]: I1204 00:24:57.545809 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"
Dec 04 00:24:57 crc kubenswrapper[4764]: E1204 00:24:57.547024 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.099499 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cg8tv"]
Dec 04 00:25:02 crc kubenswrapper[4764]: E1204 00:25:02.100074 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="registry-server"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100087 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="registry-server"
Dec 04 00:25:02 crc kubenswrapper[4764]: E1204 00:25:02.100096 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="extract-utilities"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100103 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="extract-utilities"
Dec 04 00:25:02 crc kubenswrapper[4764]: E1204 00:25:02.100115 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="registry-server"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100122 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="registry-server"
Dec 04 00:25:02 crc kubenswrapper[4764]: E1204 00:25:02.100131 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="extract-content"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100137 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="extract-content"
Dec 04 00:25:02 crc kubenswrapper[4764]: E1204 00:25:02.100149 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="extract-utilities"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100154 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="extract-utilities"
Dec 04 00:25:02 crc kubenswrapper[4764]: E1204 00:25:02.100166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="extract-content"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100172 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="extract-content"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100322 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="44bfa28c-2c81-4c2b-8585-d970a6505ac2" containerName="registry-server"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.100341 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="827a4e00-b9d9-461c-bb37-0676bfa9340e" containerName="registry-server"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.101302 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.117457 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cg8tv"]
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.217652 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fs7h\" (UniqueName: \"kubernetes.io/projected/343153ac-727e-4460-b88c-75e398cc7a1f-kube-api-access-9fs7h\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.218022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-catalog-content\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.218140 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-utilities\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.320935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fs7h\" (UniqueName: \"kubernetes.io/projected/343153ac-727e-4460-b88c-75e398cc7a1f-kube-api-access-9fs7h\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.321059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-catalog-content\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.321101 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-utilities\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.322091 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-catalog-content\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.322101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-utilities\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.349994 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fs7h\" (UniqueName: \"kubernetes.io/projected/343153ac-727e-4460-b88c-75e398cc7a1f-kube-api-access-9fs7h\") pod \"certified-operators-cg8tv\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.436626 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cg8tv"
Dec 04 00:25:02 crc kubenswrapper[4764]: I1204 00:25:02.885216 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cg8tv"]
Dec 04 00:25:03 crc kubenswrapper[4764]: I1204 00:25:03.628375 4764 generic.go:334] "Generic (PLEG): container finished" podID="343153ac-727e-4460-b88c-75e398cc7a1f" containerID="71a5b19e000ffc987d548fd3c6e3262ccec544e9cc737168533ff1825237d95e" exitCode=0
Dec 04 00:25:03 crc kubenswrapper[4764]: I1204 00:25:03.628490 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerDied","Data":"71a5b19e000ffc987d548fd3c6e3262ccec544e9cc737168533ff1825237d95e"}
Dec 04 00:25:03 crc kubenswrapper[4764]: I1204 00:25:03.628854 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerStarted","Data":"b6426fd0eb5ab2cebe1f7e0ff91feb6a5182cbe2a781308760f6b898fbe8ecbd"}
Dec 04 00:25:04 crc kubenswrapper[4764]: I1204 00:25:04.638523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerStarted","Data":"ec75c2807235335cbd9789a45e4fecfe90fb8fcaa9a0d39eb3ab33d4bdb809ac"}
Dec 04 00:25:05 crc kubenswrapper[4764]: I1204 00:25:05.648822 4764 generic.go:334] "Generic (PLEG): container finished" podID="343153ac-727e-4460-b88c-75e398cc7a1f" containerID="ec75c2807235335cbd9789a45e4fecfe90fb8fcaa9a0d39eb3ab33d4bdb809ac" exitCode=0
Dec 04 00:25:05 crc kubenswrapper[4764]: I1204 00:25:05.648892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerDied","Data":"ec75c2807235335cbd9789a45e4fecfe90fb8fcaa9a0d39eb3ab33d4bdb809ac"}
Dec 04 00:25:06 crc kubenswrapper[4764]: I1204 00:25:06.656256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerStarted","Data":"54249a1c638e9a446f9f5e23b8c9b41e71d323b23a78c7d28a4eb708f6e368c9"}
Dec 04 00:25:06 crc kubenswrapper[4764]: I1204 00:25:06.678403 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cg8tv" podStartSLOduration=1.877630088 podStartE2EDuration="4.678348506s" podCreationTimestamp="2025-12-04 00:25:02 +0000 UTC" firstStartedPulling="2025-12-04 00:25:03.629783631 +0000 UTC m=+2639.391108042" lastFinishedPulling="2025-12-04 00:25:06.430502019 +0000 UTC m=+2642.191826460" observedRunningTime="2025-12-04 00:25:06.670341628 +0000 UTC m=+2642.431666049" watchObservedRunningTime="2025-12-04 00:25:06.678348506 +0000 UTC m=+2642.439672917"
Dec 04 00:25:10 crc kubenswrapper[4764]: I1204 00:25:10.547885 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"
Dec 04
00:25:10 crc kubenswrapper[4764]: E1204 00:25:10.548699 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:25:12 crc kubenswrapper[4764]: I1204 00:25:12.436995 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cg8tv" Dec 04 00:25:12 crc kubenswrapper[4764]: I1204 00:25:12.437416 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cg8tv" Dec 04 00:25:12 crc kubenswrapper[4764]: I1204 00:25:12.499741 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cg8tv" Dec 04 00:25:12 crc kubenswrapper[4764]: I1204 00:25:12.774056 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cg8tv" Dec 04 00:25:12 crc kubenswrapper[4764]: I1204 00:25:12.834138 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cg8tv"] Dec 04 00:25:14 crc kubenswrapper[4764]: I1204 00:25:14.721272 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cg8tv" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="registry-server" containerID="cri-o://54249a1c638e9a446f9f5e23b8c9b41e71d323b23a78c7d28a4eb708f6e368c9" gracePeriod=2 Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.732312 4764 generic.go:334] "Generic (PLEG): container finished" podID="343153ac-727e-4460-b88c-75e398cc7a1f" 
containerID="54249a1c638e9a446f9f5e23b8c9b41e71d323b23a78c7d28a4eb708f6e368c9" exitCode=0 Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.732397 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerDied","Data":"54249a1c638e9a446f9f5e23b8c9b41e71d323b23a78c7d28a4eb708f6e368c9"} Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.850235 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cg8tv" Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.935328 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-catalog-content\") pod \"343153ac-727e-4460-b88c-75e398cc7a1f\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.935473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fs7h\" (UniqueName: \"kubernetes.io/projected/343153ac-727e-4460-b88c-75e398cc7a1f-kube-api-access-9fs7h\") pod \"343153ac-727e-4460-b88c-75e398cc7a1f\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.935572 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-utilities\") pod \"343153ac-727e-4460-b88c-75e398cc7a1f\" (UID: \"343153ac-727e-4460-b88c-75e398cc7a1f\") " Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.936492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-utilities" (OuterVolumeSpecName: "utilities") pod "343153ac-727e-4460-b88c-75e398cc7a1f" (UID: 
"343153ac-727e-4460-b88c-75e398cc7a1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.940185 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343153ac-727e-4460-b88c-75e398cc7a1f-kube-api-access-9fs7h" (OuterVolumeSpecName: "kube-api-access-9fs7h") pod "343153ac-727e-4460-b88c-75e398cc7a1f" (UID: "343153ac-727e-4460-b88c-75e398cc7a1f"). InnerVolumeSpecName "kube-api-access-9fs7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:25:15 crc kubenswrapper[4764]: I1204 00:25:15.997101 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "343153ac-727e-4460-b88c-75e398cc7a1f" (UID: "343153ac-727e-4460-b88c-75e398cc7a1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.037751 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.037787 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/343153ac-727e-4460-b88c-75e398cc7a1f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.037808 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fs7h\" (UniqueName: \"kubernetes.io/projected/343153ac-727e-4460-b88c-75e398cc7a1f-kube-api-access-9fs7h\") on node \"crc\" DevicePath \"\"" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.742484 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-cg8tv" event={"ID":"343153ac-727e-4460-b88c-75e398cc7a1f","Type":"ContainerDied","Data":"b6426fd0eb5ab2cebe1f7e0ff91feb6a5182cbe2a781308760f6b898fbe8ecbd"} Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.742541 4764 scope.go:117] "RemoveContainer" containerID="54249a1c638e9a446f9f5e23b8c9b41e71d323b23a78c7d28a4eb708f6e368c9" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.742551 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cg8tv" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.769008 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cg8tv"] Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.770461 4764 scope.go:117] "RemoveContainer" containerID="ec75c2807235335cbd9789a45e4fecfe90fb8fcaa9a0d39eb3ab33d4bdb809ac" Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.778010 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cg8tv"] Dec 04 00:25:16 crc kubenswrapper[4764]: I1204 00:25:16.794329 4764 scope.go:117] "RemoveContainer" containerID="71a5b19e000ffc987d548fd3c6e3262ccec544e9cc737168533ff1825237d95e" Dec 04 00:25:18 crc kubenswrapper[4764]: I1204 00:25:18.562477 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" path="/var/lib/kubelet/pods/343153ac-727e-4460-b88c-75e398cc7a1f/volumes" Dec 04 00:25:24 crc kubenswrapper[4764]: I1204 00:25:24.551382 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:25:24 crc kubenswrapper[4764]: E1204 00:25:24.551923 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:25:39 crc kubenswrapper[4764]: I1204 00:25:39.546651 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:25:39 crc kubenswrapper[4764]: E1204 00:25:39.549415 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:25:50 crc kubenswrapper[4764]: I1204 00:25:50.546352 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:25:50 crc kubenswrapper[4764]: E1204 00:25:50.547347 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:26:03 crc kubenswrapper[4764]: I1204 00:26:03.545508 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:26:03 crc kubenswrapper[4764]: E1204 00:26:03.546203 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:26:17 crc kubenswrapper[4764]: I1204 00:26:17.545799 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:26:17 crc kubenswrapper[4764]: E1204 00:26:17.546630 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:26:30 crc kubenswrapper[4764]: I1204 00:26:30.546399 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328" Dec 04 00:26:31 crc kubenswrapper[4764]: I1204 00:26:31.445098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"07dc22c5374547618804750ee2c27c6498a31303047f6eb665d94488cb9dff46"} Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.798886 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gzdnw"] Dec 04 00:28:01 crc kubenswrapper[4764]: E1204 00:28:01.799681 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="extract-utilities" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.799696 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" 
containerName="extract-utilities" Dec 04 00:28:01 crc kubenswrapper[4764]: E1204 00:28:01.799710 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="extract-content" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.799735 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="extract-content" Dec 04 00:28:01 crc kubenswrapper[4764]: E1204 00:28:01.799770 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="registry-server" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.799779 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="registry-server" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.799953 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="343153ac-727e-4460-b88c-75e398cc7a1f" containerName="registry-server" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.801101 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.825496 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzdnw"] Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.994461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-catalog-content\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.994541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vnz\" (UniqueName: \"kubernetes.io/projected/ed9d76db-7c00-49ef-b25e-7b646ec7763a-kube-api-access-v9vnz\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:01 crc kubenswrapper[4764]: I1204 00:28:01.994726 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-utilities\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.096166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-catalog-content\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.096228 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v9vnz\" (UniqueName: \"kubernetes.io/projected/ed9d76db-7c00-49ef-b25e-7b646ec7763a-kube-api-access-v9vnz\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.096276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-utilities\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.096847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-catalog-content\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.096875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-utilities\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.117347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vnz\" (UniqueName: \"kubernetes.io/projected/ed9d76db-7c00-49ef-b25e-7b646ec7763a-kube-api-access-v9vnz\") pod \"redhat-marketplace-gzdnw\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") " pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.127087 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:02 crc kubenswrapper[4764]: I1204 00:28:02.591614 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzdnw"] Dec 04 00:28:02 crc kubenswrapper[4764]: W1204 00:28:02.596553 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded9d76db_7c00_49ef_b25e_7b646ec7763a.slice/crio-7d4258f89b67ee9e174bd5f7aacd2c2534e12d7cd4237830e01da501f5c039f5 WatchSource:0}: Error finding container 7d4258f89b67ee9e174bd5f7aacd2c2534e12d7cd4237830e01da501f5c039f5: Status 404 returned error can't find the container with id 7d4258f89b67ee9e174bd5f7aacd2c2534e12d7cd4237830e01da501f5c039f5 Dec 04 00:28:03 crc kubenswrapper[4764]: I1204 00:28:03.145966 4764 generic.go:334] "Generic (PLEG): container finished" podID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerID="7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d" exitCode=0 Dec 04 00:28:03 crc kubenswrapper[4764]: I1204 00:28:03.146027 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzdnw" event={"ID":"ed9d76db-7c00-49ef-b25e-7b646ec7763a","Type":"ContainerDied","Data":"7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d"} Dec 04 00:28:03 crc kubenswrapper[4764]: I1204 00:28:03.146095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzdnw" event={"ID":"ed9d76db-7c00-49ef-b25e-7b646ec7763a","Type":"ContainerStarted","Data":"7d4258f89b67ee9e174bd5f7aacd2c2534e12d7cd4237830e01da501f5c039f5"} Dec 04 00:28:04 crc kubenswrapper[4764]: I1204 00:28:04.162049 4764 generic.go:334] "Generic (PLEG): container finished" podID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerID="dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c" exitCode=0 Dec 04 00:28:04 crc kubenswrapper[4764]: I1204 
00:28:04.162391 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzdnw" event={"ID":"ed9d76db-7c00-49ef-b25e-7b646ec7763a","Type":"ContainerDied","Data":"dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c"} Dec 04 00:28:05 crc kubenswrapper[4764]: I1204 00:28:05.170648 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzdnw" event={"ID":"ed9d76db-7c00-49ef-b25e-7b646ec7763a","Type":"ContainerStarted","Data":"3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01"} Dec 04 00:28:05 crc kubenswrapper[4764]: I1204 00:28:05.188847 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gzdnw" podStartSLOduration=2.725804409 podStartE2EDuration="4.188802492s" podCreationTimestamp="2025-12-04 00:28:01 +0000 UTC" firstStartedPulling="2025-12-04 00:28:03.149124126 +0000 UTC m=+2818.910448547" lastFinishedPulling="2025-12-04 00:28:04.612122209 +0000 UTC m=+2820.373446630" observedRunningTime="2025-12-04 00:28:05.187204923 +0000 UTC m=+2820.948529334" watchObservedRunningTime="2025-12-04 00:28:05.188802492 +0000 UTC m=+2820.950126903" Dec 04 00:28:12 crc kubenswrapper[4764]: I1204 00:28:12.127813 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:12 crc kubenswrapper[4764]: I1204 00:28:12.129851 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:12 crc kubenswrapper[4764]: I1204 00:28:12.206465 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:12 crc kubenswrapper[4764]: I1204 00:28:12.294082 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 
00:28:12 crc kubenswrapper[4764]: I1204 00:28:12.448681 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzdnw"] Dec 04 00:28:14 crc kubenswrapper[4764]: I1204 00:28:14.249387 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gzdnw" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="registry-server" containerID="cri-o://3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01" gracePeriod=2 Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.223100 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.258796 4764 generic.go:334] "Generic (PLEG): container finished" podID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerID="3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01" exitCode=0 Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.258853 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzdnw" event={"ID":"ed9d76db-7c00-49ef-b25e-7b646ec7763a","Type":"ContainerDied","Data":"3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01"} Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.258881 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gzdnw" Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.258897 4764 scope.go:117] "RemoveContainer" containerID="3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01" Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.258884 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gzdnw" event={"ID":"ed9d76db-7c00-49ef-b25e-7b646ec7763a","Type":"ContainerDied","Data":"7d4258f89b67ee9e174bd5f7aacd2c2534e12d7cd4237830e01da501f5c039f5"} Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.279871 4764 scope.go:117] "RemoveContainer" containerID="dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c" Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.301613 4764 scope.go:117] "RemoveContainer" containerID="7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d" Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.350879 4764 scope.go:117] "RemoveContainer" containerID="3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01" Dec 04 00:28:15 crc kubenswrapper[4764]: E1204 00:28:15.351304 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01\": container with ID starting with 3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01 not found: ID does not exist" containerID="3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01" Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.351339 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01"} err="failed to get container status \"3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01\": rpc error: code = NotFound desc = could not find container 
\"3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01\": container with ID starting with 3fad7db1a6338b710d9c19d12cd5d2e88ac45498ece81949e861cf01dc3ffe01 not found: ID does not exist"
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.351357 4764 scope.go:117] "RemoveContainer" containerID="dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c"
Dec 04 00:28:15 crc kubenswrapper[4764]: E1204 00:28:15.351649 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c\": container with ID starting with dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c not found: ID does not exist" containerID="dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c"
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.351672 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c"} err="failed to get container status \"dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c\": rpc error: code = NotFound desc = could not find container \"dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c\": container with ID starting with dda2594d6d732a368308ebfaa5d1981c0fbf9a897a7936134a3cd6e25416be8c not found: ID does not exist"
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.351687 4764 scope.go:117] "RemoveContainer" containerID="7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d"
Dec 04 00:28:15 crc kubenswrapper[4764]: E1204 00:28:15.357315 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d\": container with ID starting with 7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d not found: ID does not exist" containerID="7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d"
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.357354 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d"} err="failed to get container status \"7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d\": rpc error: code = NotFound desc = could not find container \"7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d\": container with ID starting with 7b05daadecdf97e821124d023c87963a85db9f956c1456dec4fe48c1fd82b53d not found: ID does not exist"
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.399360 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-utilities\") pod \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") "
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.399428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-catalog-content\") pod \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") "
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.399549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9vnz\" (UniqueName: \"kubernetes.io/projected/ed9d76db-7c00-49ef-b25e-7b646ec7763a-kube-api-access-v9vnz\") pod \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\" (UID: \"ed9d76db-7c00-49ef-b25e-7b646ec7763a\") "
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.400152 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-utilities" (OuterVolumeSpecName: "utilities") pod "ed9d76db-7c00-49ef-b25e-7b646ec7763a" (UID: "ed9d76db-7c00-49ef-b25e-7b646ec7763a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.412135 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed9d76db-7c00-49ef-b25e-7b646ec7763a-kube-api-access-v9vnz" (OuterVolumeSpecName: "kube-api-access-v9vnz") pod "ed9d76db-7c00-49ef-b25e-7b646ec7763a" (UID: "ed9d76db-7c00-49ef-b25e-7b646ec7763a"). InnerVolumeSpecName "kube-api-access-v9vnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.418491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed9d76db-7c00-49ef-b25e-7b646ec7763a" (UID: "ed9d76db-7c00-49ef-b25e-7b646ec7763a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.501050 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.501079 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed9d76db-7c00-49ef-b25e-7b646ec7763a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.501088 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9vnz\" (UniqueName: \"kubernetes.io/projected/ed9d76db-7c00-49ef-b25e-7b646ec7763a-kube-api-access-v9vnz\") on node \"crc\" DevicePath \"\""
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.588758 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzdnw"]
Dec 04 00:28:15 crc kubenswrapper[4764]: I1204 00:28:15.601481 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gzdnw"]
Dec 04 00:28:16 crc kubenswrapper[4764]: I1204 00:28:16.558039 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" path="/var/lib/kubelet/pods/ed9d76db-7c00-49ef-b25e-7b646ec7763a/volumes"
Dec 04 00:28:50 crc kubenswrapper[4764]: I1204 00:28:50.868991 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 00:28:50 crc kubenswrapper[4764]: I1204 00:28:50.869709 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 00:29:20 crc kubenswrapper[4764]: I1204 00:29:20.869314 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 00:29:20 crc kubenswrapper[4764]: I1204 00:29:20.869790 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 00:29:50 crc kubenswrapper[4764]: I1204 00:29:50.869456 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 00:29:50 crc kubenswrapper[4764]: I1204 00:29:50.870247 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 00:29:50 crc kubenswrapper[4764]: I1204 00:29:50.870319 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl"
Dec 04 00:29:50 crc kubenswrapper[4764]: I1204 00:29:50.871333 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"07dc22c5374547618804750ee2c27c6498a31303047f6eb665d94488cb9dff46"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 00:29:50 crc kubenswrapper[4764]: I1204 00:29:50.871440 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://07dc22c5374547618804750ee2c27c6498a31303047f6eb665d94488cb9dff46" gracePeriod=600
Dec 04 00:29:51 crc kubenswrapper[4764]: I1204 00:29:51.063792 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="07dc22c5374547618804750ee2c27c6498a31303047f6eb665d94488cb9dff46" exitCode=0
Dec 04 00:29:51 crc kubenswrapper[4764]: I1204 00:29:51.063875 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"07dc22c5374547618804750ee2c27c6498a31303047f6eb665d94488cb9dff46"}
Dec 04 00:29:51 crc kubenswrapper[4764]: I1204 00:29:51.064194 4764 scope.go:117] "RemoveContainer" containerID="48327b01b7d141a3fb486875380a5451c93aae7214dd28996849364a358d1328"
Dec 04 00:29:52 crc kubenswrapper[4764]: I1204 00:29:52.080622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"}
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.146398 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"]
Dec 04 00:30:00 crc kubenswrapper[4764]: E1204 00:30:00.147135 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="registry-server"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.147148 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="registry-server"
Dec 04 00:30:00 crc kubenswrapper[4764]: E1204 00:30:00.147180 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="extract-utilities"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.147187 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="extract-utilities"
Dec 04 00:30:00 crc kubenswrapper[4764]: E1204 00:30:00.147200 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="extract-content"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.147206 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="extract-content"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.147324 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed9d76db-7c00-49ef-b25e-7b646ec7763a" containerName="registry-server"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.147801 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.150858 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.150900 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.163235 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b2a5047-a6e6-4243-86dc-4ce470ab83af-config-volume\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.163555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b2a5047-a6e6-4243-86dc-4ce470ab83af-secret-volume\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.163596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzvqh\" (UniqueName: \"kubernetes.io/projected/1b2a5047-a6e6-4243-86dc-4ce470ab83af-kube-api-access-lzvqh\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.164555 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"]
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.265630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b2a5047-a6e6-4243-86dc-4ce470ab83af-config-volume\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.265745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b2a5047-a6e6-4243-86dc-4ce470ab83af-secret-volume\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.265772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzvqh\" (UniqueName: \"kubernetes.io/projected/1b2a5047-a6e6-4243-86dc-4ce470ab83af-kube-api-access-lzvqh\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.267608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b2a5047-a6e6-4243-86dc-4ce470ab83af-config-volume\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.272658 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b2a5047-a6e6-4243-86dc-4ce470ab83af-secret-volume\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.293884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzvqh\" (UniqueName: \"kubernetes.io/projected/1b2a5047-a6e6-4243-86dc-4ce470ab83af-kube-api-access-lzvqh\") pod \"collect-profiles-29413470-kcvf8\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.465405 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:00 crc kubenswrapper[4764]: I1204 00:30:00.931280 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"]
Dec 04 00:30:01 crc kubenswrapper[4764]: I1204 00:30:01.163416 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8" event={"ID":"1b2a5047-a6e6-4243-86dc-4ce470ab83af","Type":"ContainerStarted","Data":"f250af733a762211cec18f3a68cbcfb5b2b74f82d3044f92ab8bdaf07acad9c5"}
Dec 04 00:30:02 crc kubenswrapper[4764]: I1204 00:30:02.173191 4764 generic.go:334] "Generic (PLEG): container finished" podID="1b2a5047-a6e6-4243-86dc-4ce470ab83af" containerID="9b756fc09137183b5a66608c09aa8cb4764e3e6f4c356e4a53f7ab70fd9c2ad8" exitCode=0
Dec 04 00:30:02 crc kubenswrapper[4764]: I1204 00:30:02.173237 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8" event={"ID":"1b2a5047-a6e6-4243-86dc-4ce470ab83af","Type":"ContainerDied","Data":"9b756fc09137183b5a66608c09aa8cb4764e3e6f4c356e4a53f7ab70fd9c2ad8"}
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.555704 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.627207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzvqh\" (UniqueName: \"kubernetes.io/projected/1b2a5047-a6e6-4243-86dc-4ce470ab83af-kube-api-access-lzvqh\") pod \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") "
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.627266 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b2a5047-a6e6-4243-86dc-4ce470ab83af-config-volume\") pod \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") "
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.627294 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b2a5047-a6e6-4243-86dc-4ce470ab83af-secret-volume\") pod \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\" (UID: \"1b2a5047-a6e6-4243-86dc-4ce470ab83af\") "
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.627875 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b2a5047-a6e6-4243-86dc-4ce470ab83af-config-volume" (OuterVolumeSpecName: "config-volume") pod "1b2a5047-a6e6-4243-86dc-4ce470ab83af" (UID: "1b2a5047-a6e6-4243-86dc-4ce470ab83af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.633873 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2a5047-a6e6-4243-86dc-4ce470ab83af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1b2a5047-a6e6-4243-86dc-4ce470ab83af" (UID: "1b2a5047-a6e6-4243-86dc-4ce470ab83af"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.634582 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2a5047-a6e6-4243-86dc-4ce470ab83af-kube-api-access-lzvqh" (OuterVolumeSpecName: "kube-api-access-lzvqh") pod "1b2a5047-a6e6-4243-86dc-4ce470ab83af" (UID: "1b2a5047-a6e6-4243-86dc-4ce470ab83af"). InnerVolumeSpecName "kube-api-access-lzvqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.729337 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1b2a5047-a6e6-4243-86dc-4ce470ab83af-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.729377 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzvqh\" (UniqueName: \"kubernetes.io/projected/1b2a5047-a6e6-4243-86dc-4ce470ab83af-kube-api-access-lzvqh\") on node \"crc\" DevicePath \"\""
Dec 04 00:30:03 crc kubenswrapper[4764]: I1204 00:30:03.729392 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1b2a5047-a6e6-4243-86dc-4ce470ab83af-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 00:30:04 crc kubenswrapper[4764]: I1204 00:30:04.188537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8" event={"ID":"1b2a5047-a6e6-4243-86dc-4ce470ab83af","Type":"ContainerDied","Data":"f250af733a762211cec18f3a68cbcfb5b2b74f82d3044f92ab8bdaf07acad9c5"}
Dec 04 00:30:04 crc kubenswrapper[4764]: I1204 00:30:04.188577 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f250af733a762211cec18f3a68cbcfb5b2b74f82d3044f92ab8bdaf07acad9c5"
Dec 04 00:30:04 crc kubenswrapper[4764]: I1204 00:30:04.188626 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"
Dec 04 00:30:04 crc kubenswrapper[4764]: I1204 00:30:04.626623 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"]
Dec 04 00:30:04 crc kubenswrapper[4764]: I1204 00:30:04.633761 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413425-n69h2"]
Dec 04 00:30:06 crc kubenswrapper[4764]: I1204 00:30:06.555447 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e04e7e7-c3a8-45bb-834b-b35c6e74bcca" path="/var/lib/kubelet/pods/1e04e7e7-c3a8-45bb-834b-b35c6e74bcca/volumes"
Dec 04 00:30:09 crc kubenswrapper[4764]: I1204 00:30:09.221362 4764 scope.go:117] "RemoveContainer" containerID="4084c667c523fa69892fc1a0a37c04c4401ec8a5b7b12e4ab4bd0347c10ceb2b"
Dec 04 00:32:20 crc kubenswrapper[4764]: I1204 00:32:20.869074 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 00:32:20 crc kubenswrapper[4764]: I1204 00:32:20.869677 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 00:32:50 crc kubenswrapper[4764]: I1204 00:32:50.868643 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 00:32:50 crc kubenswrapper[4764]: I1204 00:32:50.869283 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 00:33:20 crc kubenswrapper[4764]: I1204 00:33:20.868738 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 00:33:20 crc kubenswrapper[4764]: I1204 00:33:20.869244 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 00:33:20 crc kubenswrapper[4764]: I1204 00:33:20.869294 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl"
Dec 04 00:33:20 crc kubenswrapper[4764]: I1204 00:33:20.869926 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 00:33:20 crc kubenswrapper[4764]: I1204 00:33:20.870004 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" gracePeriod=600
Dec 04 00:33:21 crc kubenswrapper[4764]: E1204 00:33:21.000396 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:33:21 crc kubenswrapper[4764]: I1204 00:33:21.810043 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" exitCode=0
Dec 04 00:33:21 crc kubenswrapper[4764]: I1204 00:33:21.810382 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"}
Dec 04 00:33:21 crc kubenswrapper[4764]: I1204 00:33:21.810431 4764 scope.go:117] "RemoveContainer" containerID="07dc22c5374547618804750ee2c27c6498a31303047f6eb665d94488cb9dff46"
Dec 04 00:33:21 crc kubenswrapper[4764]: I1204 00:33:21.811102 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:33:21 crc kubenswrapper[4764]: E1204 00:33:21.811375 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:33:35 crc kubenswrapper[4764]: I1204 00:33:35.546141 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:33:35 crc kubenswrapper[4764]: E1204 00:33:35.546912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:33:46 crc kubenswrapper[4764]: I1204 00:33:46.546767 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:33:46 crc kubenswrapper[4764]: E1204 00:33:46.547423 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:33:57 crc kubenswrapper[4764]: I1204 00:33:57.545881 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:33:57 crc kubenswrapper[4764]: E1204 00:33:57.546627 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:34:11 crc kubenswrapper[4764]: I1204 00:34:11.546214 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:34:11 crc kubenswrapper[4764]: E1204 00:34:11.547084 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:34:24 crc kubenswrapper[4764]: I1204 00:34:24.554045 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:34:24 crc kubenswrapper[4764]: E1204 00:34:24.555256 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:34:36 crc kubenswrapper[4764]: I1204 00:34:36.546320 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:34:36 crc kubenswrapper[4764]: E1204 00:34:36.547949 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:34:51 crc kubenswrapper[4764]: I1204 00:34:51.545553 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:34:51 crc kubenswrapper[4764]: E1204 00:34:51.546336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:35:00 crc kubenswrapper[4764]: I1204 00:35:00.835348 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5cmfz"]
Dec 04 00:35:00 crc kubenswrapper[4764]: E1204 00:35:00.836817 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2a5047-a6e6-4243-86dc-4ce470ab83af" containerName="collect-profiles"
Dec 04 00:35:00 crc kubenswrapper[4764]: I1204 00:35:00.836868 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2a5047-a6e6-4243-86dc-4ce470ab83af" containerName="collect-profiles"
Dec 04 00:35:00 crc kubenswrapper[4764]: I1204 00:35:00.837257 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2a5047-a6e6-4243-86dc-4ce470ab83af" containerName="collect-profiles"
Dec 04 00:35:00 crc kubenswrapper[4764]: I1204 00:35:00.839782 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:00 crc kubenswrapper[4764]: I1204 00:35:00.846016 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5cmfz"]
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.022202 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66f8h\" (UniqueName: \"kubernetes.io/projected/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-kube-api-access-66f8h\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.022253 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-catalog-content\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.022274 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-utilities\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.123247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-catalog-content\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.123324 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-utilities\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.123497 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66f8h\" (UniqueName: \"kubernetes.io/projected/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-kube-api-access-66f8h\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.124048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-catalog-content\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.124094 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-utilities\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.149400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66f8h\" (UniqueName: \"kubernetes.io/projected/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-kube-api-access-66f8h\") pod \"redhat-operators-5cmfz\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.173040 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cmfz"
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.410525 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5cmfz"]
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.676974 4764 generic.go:334] "Generic (PLEG): container finished" podID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerID="8db09fc3b87b5d912d794876d1ccf886672079e99e0cdb1e6c020dffc0dfc268" exitCode=0
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.677222 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerDied","Data":"8db09fc3b87b5d912d794876d1ccf886672079e99e0cdb1e6c020dffc0dfc268"}
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.677347 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerStarted","Data":"e53937ebdd086ea128e4c6bd0ecc7a3d36f00fa33d9b5b8abc29a45148f77ed7"}
Dec 04 00:35:01 crc kubenswrapper[4764]: I1204 00:35:01.678893 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 00:35:02 crc kubenswrapper[4764]: I1204 00:35:02.686303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerStarted","Data":"c0a3ca5d9998b5668057db2a161f472cb13955f348fd346ff24490499f117622"}
Dec 04 00:35:03 crc kubenswrapper[4764]: I1204 00:35:03.700422 4764 generic.go:334] "Generic (PLEG): container finished" podID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerID="c0a3ca5d9998b5668057db2a161f472cb13955f348fd346ff24490499f117622" exitCode=0
Dec 04 00:35:03 crc kubenswrapper[4764]: I1204 00:35:03.700521 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerDied","Data":"c0a3ca5d9998b5668057db2a161f472cb13955f348fd346ff24490499f117622"}
Dec 04 00:35:04 crc kubenswrapper[4764]: I1204 00:35:04.548845 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6"
Dec 04 00:35:04 crc kubenswrapper[4764]: E1204 00:35:04.549317 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"
Dec 04 00:35:04 crc kubenswrapper[4764]: I1204 00:35:04.708994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerStarted","Data":"992ee3f71d988626b2c009d0cac4d8a1559e227e956effcb770dab69f3231ef0"}
Dec 04 00:35:04 crc kubenswrapper[4764]: I1204 00:35:04.726765 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5cmfz" podStartSLOduration=2.313867731 podStartE2EDuration="4.726744445s" podCreationTimestamp="2025-12-04 00:35:00 +0000 UTC" firstStartedPulling="2025-12-04 00:35:01.67866014 +0000 UTC m=+3237.439984551" lastFinishedPulling="2025-12-04 00:35:04.091536854 +0000 UTC
m=+3239.852861265" observedRunningTime="2025-12-04 00:35:04.726069088 +0000 UTC m=+3240.487393499" watchObservedRunningTime="2025-12-04 00:35:04.726744445 +0000 UTC m=+3240.488068856" Dec 04 00:35:11 crc kubenswrapper[4764]: I1204 00:35:11.173486 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5cmfz" Dec 04 00:35:11 crc kubenswrapper[4764]: I1204 00:35:11.174901 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5cmfz" Dec 04 00:35:11 crc kubenswrapper[4764]: I1204 00:35:11.243228 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5cmfz" Dec 04 00:35:11 crc kubenswrapper[4764]: I1204 00:35:11.834191 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5cmfz" Dec 04 00:35:11 crc kubenswrapper[4764]: I1204 00:35:11.878489 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5cmfz"] Dec 04 00:35:13 crc kubenswrapper[4764]: I1204 00:35:13.790027 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5cmfz" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="registry-server" containerID="cri-o://992ee3f71d988626b2c009d0cac4d8a1559e227e956effcb770dab69f3231ef0" gracePeriod=2 Dec 04 00:35:16 crc kubenswrapper[4764]: I1204 00:35:16.820976 4764 generic.go:334] "Generic (PLEG): container finished" podID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerID="992ee3f71d988626b2c009d0cac4d8a1559e227e956effcb770dab69f3231ef0" exitCode=0 Dec 04 00:35:16 crc kubenswrapper[4764]: I1204 00:35:16.821072 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" 
event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerDied","Data":"992ee3f71d988626b2c009d0cac4d8a1559e227e956effcb770dab69f3231ef0"} Dec 04 00:35:16 crc kubenswrapper[4764]: I1204 00:35:16.939412 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cmfz" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.055582 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66f8h\" (UniqueName: \"kubernetes.io/projected/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-kube-api-access-66f8h\") pod \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.055697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-utilities\") pod \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.055761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-catalog-content\") pod \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\" (UID: \"65bc62c2-8fd7-4ec2-948f-7f26f267fe22\") " Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.056589 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-utilities" (OuterVolumeSpecName: "utilities") pod "65bc62c2-8fd7-4ec2-948f-7f26f267fe22" (UID: "65bc62c2-8fd7-4ec2-948f-7f26f267fe22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.061524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-kube-api-access-66f8h" (OuterVolumeSpecName: "kube-api-access-66f8h") pod "65bc62c2-8fd7-4ec2-948f-7f26f267fe22" (UID: "65bc62c2-8fd7-4ec2-948f-7f26f267fe22"). InnerVolumeSpecName "kube-api-access-66f8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.157286 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66f8h\" (UniqueName: \"kubernetes.io/projected/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-kube-api-access-66f8h\") on node \"crc\" DevicePath \"\"" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.157325 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.189647 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65bc62c2-8fd7-4ec2-948f-7f26f267fe22" (UID: "65bc62c2-8fd7-4ec2-948f-7f26f267fe22"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.258931 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bc62c2-8fd7-4ec2-948f-7f26f267fe22-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.835165 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5cmfz" event={"ID":"65bc62c2-8fd7-4ec2-948f-7f26f267fe22","Type":"ContainerDied","Data":"e53937ebdd086ea128e4c6bd0ecc7a3d36f00fa33d9b5b8abc29a45148f77ed7"} Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.835278 4764 scope.go:117] "RemoveContainer" containerID="992ee3f71d988626b2c009d0cac4d8a1559e227e956effcb770dab69f3231ef0" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.835321 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5cmfz" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.867967 4764 scope.go:117] "RemoveContainer" containerID="c0a3ca5d9998b5668057db2a161f472cb13955f348fd346ff24490499f117622" Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.891564 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5cmfz"] Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.899404 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5cmfz"] Dec 04 00:35:17 crc kubenswrapper[4764]: I1204 00:35:17.916062 4764 scope.go:117] "RemoveContainer" containerID="8db09fc3b87b5d912d794876d1ccf886672079e99e0cdb1e6c020dffc0dfc268" Dec 04 00:35:18 crc kubenswrapper[4764]: I1204 00:35:18.546532 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:35:18 crc kubenswrapper[4764]: E1204 00:35:18.546990 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:35:18 crc kubenswrapper[4764]: I1204 00:35:18.565063 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" path="/var/lib/kubelet/pods/65bc62c2-8fd7-4ec2-948f-7f26f267fe22/volumes" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.578525 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wq2db"] Dec 04 00:35:19 crc kubenswrapper[4764]: E1204 00:35:19.579305 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="extract-content" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.579329 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="extract-content" Dec 04 00:35:19 crc kubenswrapper[4764]: E1204 00:35:19.579366 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="extract-utilities" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.579382 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="extract-utilities" Dec 04 00:35:19 crc kubenswrapper[4764]: E1204 00:35:19.579408 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="registry-server" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.579423 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" 
containerName="registry-server" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.579691 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bc62c2-8fd7-4ec2-948f-7f26f267fe22" containerName="registry-server" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.581922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.595320 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq2db"] Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.697863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c5f969-84ca-4652-8914-d01b7bbac800-utilities\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.698001 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c5f969-84ca-4652-8914-d01b7bbac800-catalog-content\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.698062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghdw4\" (UniqueName: \"kubernetes.io/projected/26c5f969-84ca-4652-8914-d01b7bbac800-kube-api-access-ghdw4\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.800176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c5f969-84ca-4652-8914-d01b7bbac800-catalog-content\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.800242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghdw4\" (UniqueName: \"kubernetes.io/projected/26c5f969-84ca-4652-8914-d01b7bbac800-kube-api-access-ghdw4\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.800325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c5f969-84ca-4652-8914-d01b7bbac800-utilities\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.800731 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26c5f969-84ca-4652-8914-d01b7bbac800-catalog-content\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.800779 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26c5f969-84ca-4652-8914-d01b7bbac800-utilities\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.823686 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghdw4\" (UniqueName: 
\"kubernetes.io/projected/26c5f969-84ca-4652-8914-d01b7bbac800-kube-api-access-ghdw4\") pod \"community-operators-wq2db\" (UID: \"26c5f969-84ca-4652-8914-d01b7bbac800\") " pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:19 crc kubenswrapper[4764]: I1204 00:35:19.910400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:20 crc kubenswrapper[4764]: I1204 00:35:20.459284 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq2db"] Dec 04 00:35:20 crc kubenswrapper[4764]: I1204 00:35:20.860604 4764 generic.go:334] "Generic (PLEG): container finished" podID="26c5f969-84ca-4652-8914-d01b7bbac800" containerID="cadf7d3b6dee53380f4b0b101d3cad2db3f604f055f840575b271f998863869e" exitCode=0 Dec 04 00:35:20 crc kubenswrapper[4764]: I1204 00:35:20.860660 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq2db" event={"ID":"26c5f969-84ca-4652-8914-d01b7bbac800","Type":"ContainerDied","Data":"cadf7d3b6dee53380f4b0b101d3cad2db3f604f055f840575b271f998863869e"} Dec 04 00:35:20 crc kubenswrapper[4764]: I1204 00:35:20.860943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq2db" event={"ID":"26c5f969-84ca-4652-8914-d01b7bbac800","Type":"ContainerStarted","Data":"a0f40a44ab2afb01a681638524fd2cd2903056120673a5d9d47548ea8d747824"} Dec 04 00:35:24 crc kubenswrapper[4764]: I1204 00:35:24.896216 4764 generic.go:334] "Generic (PLEG): container finished" podID="26c5f969-84ca-4652-8914-d01b7bbac800" containerID="c6d0496377a440271f0fb0caba04a91b2d61edf027691eb49d57e2bedb31c97e" exitCode=0 Dec 04 00:35:24 crc kubenswrapper[4764]: I1204 00:35:24.896306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq2db" 
event={"ID":"26c5f969-84ca-4652-8914-d01b7bbac800","Type":"ContainerDied","Data":"c6d0496377a440271f0fb0caba04a91b2d61edf027691eb49d57e2bedb31c97e"} Dec 04 00:35:25 crc kubenswrapper[4764]: I1204 00:35:25.905894 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq2db" event={"ID":"26c5f969-84ca-4652-8914-d01b7bbac800","Type":"ContainerStarted","Data":"4bd6d06711367c0f3df97ed72a416d5c7926de7155ff118bb9014ddacca77dc8"} Dec 04 00:35:25 crc kubenswrapper[4764]: I1204 00:35:25.929369 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wq2db" podStartSLOduration=2.1385887 podStartE2EDuration="6.929343519s" podCreationTimestamp="2025-12-04 00:35:19 +0000 UTC" firstStartedPulling="2025-12-04 00:35:20.863588615 +0000 UTC m=+3256.624913036" lastFinishedPulling="2025-12-04 00:35:25.654343454 +0000 UTC m=+3261.415667855" observedRunningTime="2025-12-04 00:35:25.92089128 +0000 UTC m=+3261.682215701" watchObservedRunningTime="2025-12-04 00:35:25.929343519 +0000 UTC m=+3261.690667950" Dec 04 00:35:29 crc kubenswrapper[4764]: I1204 00:35:29.911433 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:29 crc kubenswrapper[4764]: I1204 00:35:29.911911 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:29 crc kubenswrapper[4764]: I1204 00:35:29.966885 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:30 crc kubenswrapper[4764]: I1204 00:35:30.548542 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:35:30 crc kubenswrapper[4764]: E1204 00:35:30.550263 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:35:39 crc kubenswrapper[4764]: I1204 00:35:39.984486 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wq2db" Dec 04 00:35:40 crc kubenswrapper[4764]: I1204 00:35:40.076210 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq2db"] Dec 04 00:35:40 crc kubenswrapper[4764]: I1204 00:35:40.120490 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-865zg"] Dec 04 00:35:40 crc kubenswrapper[4764]: I1204 00:35:40.120750 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-865zg" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="registry-server" containerID="cri-o://2f8088fe84d23c7a1a3fb472e6c00a4b47cebd6530ca96baad45f1e72ed3acda" gracePeriod=2 Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.017873 4764 generic.go:334] "Generic (PLEG): container finished" podID="947df551-a4ab-4b33-8c2f-1b535a557790" containerID="2f8088fe84d23c7a1a3fb472e6c00a4b47cebd6530ca96baad45f1e72ed3acda" exitCode=0 Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.017945 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerDied","Data":"2f8088fe84d23c7a1a3fb472e6c00a4b47cebd6530ca96baad45f1e72ed3acda"} Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.618276 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-865zg" Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.712613 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nptfk\" (UniqueName: \"kubernetes.io/projected/947df551-a4ab-4b33-8c2f-1b535a557790-kube-api-access-nptfk\") pod \"947df551-a4ab-4b33-8c2f-1b535a557790\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.712948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-utilities\") pod \"947df551-a4ab-4b33-8c2f-1b535a557790\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.713001 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-catalog-content\") pod \"947df551-a4ab-4b33-8c2f-1b535a557790\" (UID: \"947df551-a4ab-4b33-8c2f-1b535a557790\") " Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.713506 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-utilities" (OuterVolumeSpecName: "utilities") pod "947df551-a4ab-4b33-8c2f-1b535a557790" (UID: "947df551-a4ab-4b33-8c2f-1b535a557790"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.718394 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/947df551-a4ab-4b33-8c2f-1b535a557790-kube-api-access-nptfk" (OuterVolumeSpecName: "kube-api-access-nptfk") pod "947df551-a4ab-4b33-8c2f-1b535a557790" (UID: "947df551-a4ab-4b33-8c2f-1b535a557790"). InnerVolumeSpecName "kube-api-access-nptfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.774280 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "947df551-a4ab-4b33-8c2f-1b535a557790" (UID: "947df551-a4ab-4b33-8c2f-1b535a557790"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.814390 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nptfk\" (UniqueName: \"kubernetes.io/projected/947df551-a4ab-4b33-8c2f-1b535a557790-kube-api-access-nptfk\") on node \"crc\" DevicePath \"\"" Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.814428 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:35:41 crc kubenswrapper[4764]: I1204 00:35:41.814440 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/947df551-a4ab-4b33-8c2f-1b535a557790-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.029284 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-865zg" event={"ID":"947df551-a4ab-4b33-8c2f-1b535a557790","Type":"ContainerDied","Data":"2cf9cf1a1b004e158e0affae589108f3b99f208f1d8af1d20e53ae78165703c5"} Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.029334 4764 scope.go:117] "RemoveContainer" containerID="2f8088fe84d23c7a1a3fb472e6c00a4b47cebd6530ca96baad45f1e72ed3acda" Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.029349 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-865zg" Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.069072 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-865zg"] Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.073789 4764 scope.go:117] "RemoveContainer" containerID="aa2846c35039efa37416509ffb7daf24d37d2d3eb3ae7aecab345fb28f30ae63" Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.080750 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-865zg"] Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.108373 4764 scope.go:117] "RemoveContainer" containerID="c693f4d795767afa7544be682bbb891277ec5f08419c6488ef5a6fbc57e9df9a" Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.546167 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:35:42 crc kubenswrapper[4764]: E1204 00:35:42.546653 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:35:42 crc kubenswrapper[4764]: I1204 00:35:42.560288 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" path="/var/lib/kubelet/pods/947df551-a4ab-4b33-8c2f-1b535a557790/volumes" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.645593 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-98bgf"] Dec 04 00:35:53 crc kubenswrapper[4764]: E1204 00:35:53.648665 4764 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="extract-content" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.648886 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="extract-content" Dec 04 00:35:53 crc kubenswrapper[4764]: E1204 00:35:53.648982 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="extract-utilities" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.649053 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="extract-utilities" Dec 04 00:35:53 crc kubenswrapper[4764]: E1204 00:35:53.649160 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="registry-server" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.649243 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="registry-server" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.649557 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="947df551-a4ab-4b33-8c2f-1b535a557790" containerName="registry-server" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.657410 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.662202 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-98bgf"] Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.788254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-utilities\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.788320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-catalog-content\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.788383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxvnx\" (UniqueName: \"kubernetes.io/projected/c88200aa-1a67-446b-aefb-b85027801e43-kube-api-access-bxvnx\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.890157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxvnx\" (UniqueName: \"kubernetes.io/projected/c88200aa-1a67-446b-aefb-b85027801e43-kube-api-access-bxvnx\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.890278 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-utilities\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.890321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-catalog-content\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.890814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-catalog-content\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.890927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-utilities\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.939701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxvnx\" (UniqueName: \"kubernetes.io/projected/c88200aa-1a67-446b-aefb-b85027801e43-kube-api-access-bxvnx\") pod \"certified-operators-98bgf\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:53 crc kubenswrapper[4764]: I1204 00:35:53.990185 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:35:54 crc kubenswrapper[4764]: I1204 00:35:54.472661 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-98bgf"] Dec 04 00:35:55 crc kubenswrapper[4764]: I1204 00:35:55.145764 4764 generic.go:334] "Generic (PLEG): container finished" podID="c88200aa-1a67-446b-aefb-b85027801e43" containerID="8477ab76d8655ae0c9d6e54842a85efdcc9c4039d73facd9cddb2037d62a216b" exitCode=0 Dec 04 00:35:55 crc kubenswrapper[4764]: I1204 00:35:55.145830 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerDied","Data":"8477ab76d8655ae0c9d6e54842a85efdcc9c4039d73facd9cddb2037d62a216b"} Dec 04 00:35:55 crc kubenswrapper[4764]: I1204 00:35:55.145871 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerStarted","Data":"025f6b56967d63d59bac83e13133b1d50b8fc26b0977f76fa2461d745b1961d8"} Dec 04 00:35:55 crc kubenswrapper[4764]: I1204 00:35:55.546324 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:35:55 crc kubenswrapper[4764]: E1204 00:35:55.546623 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:35:56 crc kubenswrapper[4764]: I1204 00:35:56.153809 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" 
event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerStarted","Data":"0e21e8e51eeffe2ebde108a8581c5ff67de221da16cd78e316b1b0821b7635bc"} Dec 04 00:35:57 crc kubenswrapper[4764]: I1204 00:35:57.163265 4764 generic.go:334] "Generic (PLEG): container finished" podID="c88200aa-1a67-446b-aefb-b85027801e43" containerID="0e21e8e51eeffe2ebde108a8581c5ff67de221da16cd78e316b1b0821b7635bc" exitCode=0 Dec 04 00:35:57 crc kubenswrapper[4764]: I1204 00:35:57.163358 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerDied","Data":"0e21e8e51eeffe2ebde108a8581c5ff67de221da16cd78e316b1b0821b7635bc"} Dec 04 00:35:59 crc kubenswrapper[4764]: I1204 00:35:59.183192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerStarted","Data":"2838b56ecd606038d9da27629324eedd9696c823715347915a893a900a22f2be"} Dec 04 00:35:59 crc kubenswrapper[4764]: I1204 00:35:59.218269 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-98bgf" podStartSLOduration=3.23145797 podStartE2EDuration="6.218234975s" podCreationTimestamp="2025-12-04 00:35:53 +0000 UTC" firstStartedPulling="2025-12-04 00:35:55.147619751 +0000 UTC m=+3290.908944192" lastFinishedPulling="2025-12-04 00:35:58.134396766 +0000 UTC m=+3293.895721197" observedRunningTime="2025-12-04 00:35:59.215578619 +0000 UTC m=+3294.976903100" watchObservedRunningTime="2025-12-04 00:35:59.218234975 +0000 UTC m=+3294.979559436" Dec 04 00:36:03 crc kubenswrapper[4764]: I1204 00:36:03.990556 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:36:03 crc kubenswrapper[4764]: I1204 00:36:03.990925 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:36:04 crc kubenswrapper[4764]: I1204 00:36:04.036787 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:36:04 crc kubenswrapper[4764]: I1204 00:36:04.295269 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:36:04 crc kubenswrapper[4764]: I1204 00:36:04.364551 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-98bgf"] Dec 04 00:36:06 crc kubenswrapper[4764]: I1204 00:36:06.240699 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-98bgf" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="registry-server" containerID="cri-o://2838b56ecd606038d9da27629324eedd9696c823715347915a893a900a22f2be" gracePeriod=2 Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.253129 4764 generic.go:334] "Generic (PLEG): container finished" podID="c88200aa-1a67-446b-aefb-b85027801e43" containerID="2838b56ecd606038d9da27629324eedd9696c823715347915a893a900a22f2be" exitCode=0 Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.253173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerDied","Data":"2838b56ecd606038d9da27629324eedd9696c823715347915a893a900a22f2be"} Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.546739 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:36:07 crc kubenswrapper[4764]: E1204 00:36:07.546915 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.842550 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.914927 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-catalog-content\") pod \"c88200aa-1a67-446b-aefb-b85027801e43\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.915221 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxvnx\" (UniqueName: \"kubernetes.io/projected/c88200aa-1a67-446b-aefb-b85027801e43-kube-api-access-bxvnx\") pod \"c88200aa-1a67-446b-aefb-b85027801e43\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.915382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-utilities\") pod \"c88200aa-1a67-446b-aefb-b85027801e43\" (UID: \"c88200aa-1a67-446b-aefb-b85027801e43\") " Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.916480 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-utilities" (OuterVolumeSpecName: "utilities") pod "c88200aa-1a67-446b-aefb-b85027801e43" (UID: "c88200aa-1a67-446b-aefb-b85027801e43"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.924297 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c88200aa-1a67-446b-aefb-b85027801e43-kube-api-access-bxvnx" (OuterVolumeSpecName: "kube-api-access-bxvnx") pod "c88200aa-1a67-446b-aefb-b85027801e43" (UID: "c88200aa-1a67-446b-aefb-b85027801e43"). InnerVolumeSpecName "kube-api-access-bxvnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:36:07 crc kubenswrapper[4764]: I1204 00:36:07.973618 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c88200aa-1a67-446b-aefb-b85027801e43" (UID: "c88200aa-1a67-446b-aefb-b85027801e43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.017143 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.017194 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c88200aa-1a67-446b-aefb-b85027801e43-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.017209 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxvnx\" (UniqueName: \"kubernetes.io/projected/c88200aa-1a67-446b-aefb-b85027801e43-kube-api-access-bxvnx\") on node \"crc\" DevicePath \"\"" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.268676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-98bgf" 
event={"ID":"c88200aa-1a67-446b-aefb-b85027801e43","Type":"ContainerDied","Data":"025f6b56967d63d59bac83e13133b1d50b8fc26b0977f76fa2461d745b1961d8"} Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.268780 4764 scope.go:117] "RemoveContainer" containerID="2838b56ecd606038d9da27629324eedd9696c823715347915a893a900a22f2be" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.268817 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-98bgf" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.301804 4764 scope.go:117] "RemoveContainer" containerID="0e21e8e51eeffe2ebde108a8581c5ff67de221da16cd78e316b1b0821b7635bc" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.328276 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-98bgf"] Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.332789 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-98bgf"] Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.341322 4764 scope.go:117] "RemoveContainer" containerID="8477ab76d8655ae0c9d6e54842a85efdcc9c4039d73facd9cddb2037d62a216b" Dec 04 00:36:08 crc kubenswrapper[4764]: I1204 00:36:08.555670 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c88200aa-1a67-446b-aefb-b85027801e43" path="/var/lib/kubelet/pods/c88200aa-1a67-446b-aefb-b85027801e43/volumes" Dec 04 00:36:20 crc kubenswrapper[4764]: I1204 00:36:20.546628 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:36:20 crc kubenswrapper[4764]: E1204 00:36:20.547355 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:36:35 crc kubenswrapper[4764]: I1204 00:36:35.546029 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:36:35 crc kubenswrapper[4764]: E1204 00:36:35.546766 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:36:46 crc kubenswrapper[4764]: I1204 00:36:46.545851 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:36:46 crc kubenswrapper[4764]: E1204 00:36:46.546537 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:36:59 crc kubenswrapper[4764]: I1204 00:36:59.547052 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:36:59 crc kubenswrapper[4764]: E1204 00:36:59.548336 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:37:11 crc kubenswrapper[4764]: I1204 00:37:11.546350 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:37:11 crc kubenswrapper[4764]: E1204 00:37:11.548224 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:37:26 crc kubenswrapper[4764]: I1204 00:37:26.545665 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:37:26 crc kubenswrapper[4764]: E1204 00:37:26.546436 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:37:39 crc kubenswrapper[4764]: I1204 00:37:39.545844 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:37:39 crc kubenswrapper[4764]: E1204 00:37:39.546671 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:37:50 crc kubenswrapper[4764]: I1204 00:37:50.548414 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:37:50 crc kubenswrapper[4764]: E1204 00:37:50.550216 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:38:05 crc kubenswrapper[4764]: I1204 00:38:05.546007 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:38:05 crc kubenswrapper[4764]: E1204 00:38:05.546870 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:38:18 crc kubenswrapper[4764]: I1204 00:38:18.546303 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:38:18 crc kubenswrapper[4764]: E1204 00:38:18.547139 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:38:29 crc kubenswrapper[4764]: I1204 00:38:29.546045 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:38:30 crc kubenswrapper[4764]: I1204 00:38:30.459854 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"36b011b29c1a112ebb21c7ea247458c1d61fbd59347e10f58cd77f84ee48e337"} Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.390056 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s8zgf"] Dec 04 00:39:27 crc kubenswrapper[4764]: E1204 00:39:27.391557 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="registry-server" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.391599 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="registry-server" Dec 04 00:39:27 crc kubenswrapper[4764]: E1204 00:39:27.391682 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="extract-content" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.391702 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="extract-content" Dec 04 00:39:27 crc kubenswrapper[4764]: E1204 00:39:27.391763 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="extract-utilities" Dec 04 00:39:27 crc 
kubenswrapper[4764]: I1204 00:39:27.391782 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="extract-utilities" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.392136 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c88200aa-1a67-446b-aefb-b85027801e43" containerName="registry-server" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.394361 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.408211 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8zgf"] Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.488793 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-catalog-content\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.488854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdm2v\" (UniqueName: \"kubernetes.io/projected/c4918c06-8304-4564-b676-8de26331ce48-kube-api-access-zdm2v\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.489023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-utilities\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 
crc kubenswrapper[4764]: I1204 00:39:27.590436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-catalog-content\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.590485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdm2v\" (UniqueName: \"kubernetes.io/projected/c4918c06-8304-4564-b676-8de26331ce48-kube-api-access-zdm2v\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.590553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-utilities\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.591016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-catalog-content\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.591036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-utilities\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.616927 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdm2v\" (UniqueName: \"kubernetes.io/projected/c4918c06-8304-4564-b676-8de26331ce48-kube-api-access-zdm2v\") pod \"redhat-marketplace-s8zgf\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:27 crc kubenswrapper[4764]: I1204 00:39:27.717202 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:28 crc kubenswrapper[4764]: I1204 00:39:28.157114 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8zgf"] Dec 04 00:39:28 crc kubenswrapper[4764]: I1204 00:39:28.978550 4764 generic.go:334] "Generic (PLEG): container finished" podID="c4918c06-8304-4564-b676-8de26331ce48" containerID="afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204" exitCode=0 Dec 04 00:39:28 crc kubenswrapper[4764]: I1204 00:39:28.978597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8zgf" event={"ID":"c4918c06-8304-4564-b676-8de26331ce48","Type":"ContainerDied","Data":"afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204"} Dec 04 00:39:28 crc kubenswrapper[4764]: I1204 00:39:28.978626 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8zgf" event={"ID":"c4918c06-8304-4564-b676-8de26331ce48","Type":"ContainerStarted","Data":"eecd9193b3db9eaa7d71f4e4c7bfd75036e48f14038d415eb057c086d5e1572f"} Dec 04 00:39:29 crc kubenswrapper[4764]: I1204 00:39:29.990288 4764 generic.go:334] "Generic (PLEG): container finished" podID="c4918c06-8304-4564-b676-8de26331ce48" containerID="a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51" exitCode=0 Dec 04 00:39:29 crc kubenswrapper[4764]: I1204 00:39:29.990497 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-s8zgf" event={"ID":"c4918c06-8304-4564-b676-8de26331ce48","Type":"ContainerDied","Data":"a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51"} Dec 04 00:39:31 crc kubenswrapper[4764]: I1204 00:39:30.999860 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8zgf" event={"ID":"c4918c06-8304-4564-b676-8de26331ce48","Type":"ContainerStarted","Data":"87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea"} Dec 04 00:39:31 crc kubenswrapper[4764]: I1204 00:39:31.025007 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s8zgf" podStartSLOduration=2.61344821 podStartE2EDuration="4.024986209s" podCreationTimestamp="2025-12-04 00:39:27 +0000 UTC" firstStartedPulling="2025-12-04 00:39:28.980765036 +0000 UTC m=+3504.742089437" lastFinishedPulling="2025-12-04 00:39:30.392303005 +0000 UTC m=+3506.153627436" observedRunningTime="2025-12-04 00:39:31.019864251 +0000 UTC m=+3506.781188672" watchObservedRunningTime="2025-12-04 00:39:31.024986209 +0000 UTC m=+3506.786310620" Dec 04 00:39:37 crc kubenswrapper[4764]: I1204 00:39:37.717652 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:37 crc kubenswrapper[4764]: I1204 00:39:37.718679 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:37 crc kubenswrapper[4764]: I1204 00:39:37.778858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:38 crc kubenswrapper[4764]: I1204 00:39:38.135979 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:38 crc kubenswrapper[4764]: I1204 00:39:38.196087 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8zgf"] Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.080053 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s8zgf" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="registry-server" containerID="cri-o://87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea" gracePeriod=2 Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.487756 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.541270 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdm2v\" (UniqueName: \"kubernetes.io/projected/c4918c06-8304-4564-b676-8de26331ce48-kube-api-access-zdm2v\") pod \"c4918c06-8304-4564-b676-8de26331ce48\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.541354 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-utilities\") pod \"c4918c06-8304-4564-b676-8de26331ce48\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.541452 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-catalog-content\") pod \"c4918c06-8304-4564-b676-8de26331ce48\" (UID: \"c4918c06-8304-4564-b676-8de26331ce48\") " Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.542361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-utilities" (OuterVolumeSpecName: "utilities") pod 
"c4918c06-8304-4564-b676-8de26331ce48" (UID: "c4918c06-8304-4564-b676-8de26331ce48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.547147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4918c06-8304-4564-b676-8de26331ce48-kube-api-access-zdm2v" (OuterVolumeSpecName: "kube-api-access-zdm2v") pod "c4918c06-8304-4564-b676-8de26331ce48" (UID: "c4918c06-8304-4564-b676-8de26331ce48"). InnerVolumeSpecName "kube-api-access-zdm2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.575115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4918c06-8304-4564-b676-8de26331ce48" (UID: "c4918c06-8304-4564-b676-8de26331ce48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.642820 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.642877 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdm2v\" (UniqueName: \"kubernetes.io/projected/c4918c06-8304-4564-b676-8de26331ce48-kube-api-access-zdm2v\") on node \"crc\" DevicePath \"\"" Dec 04 00:39:40 crc kubenswrapper[4764]: I1204 00:39:40.642898 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4918c06-8304-4564-b676-8de26331ce48-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.089195 4764 generic.go:334] "Generic (PLEG): container finished" podID="c4918c06-8304-4564-b676-8de26331ce48" containerID="87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea" exitCode=0 Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.089241 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8zgf" event={"ID":"c4918c06-8304-4564-b676-8de26331ce48","Type":"ContainerDied","Data":"87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea"} Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.089269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8zgf" event={"ID":"c4918c06-8304-4564-b676-8de26331ce48","Type":"ContainerDied","Data":"eecd9193b3db9eaa7d71f4e4c7bfd75036e48f14038d415eb057c086d5e1572f"} Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.089285 4764 scope.go:117] "RemoveContainer" containerID="87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 
00:39:41.089302 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8zgf" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.115196 4764 scope.go:117] "RemoveContainer" containerID="a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.142267 4764 scope.go:117] "RemoveContainer" containerID="afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.157730 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8zgf"] Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.165688 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8zgf"] Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.175837 4764 scope.go:117] "RemoveContainer" containerID="87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea" Dec 04 00:39:41 crc kubenswrapper[4764]: E1204 00:39:41.176464 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea\": container with ID starting with 87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea not found: ID does not exist" containerID="87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.176508 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea"} err="failed to get container status \"87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea\": rpc error: code = NotFound desc = could not find container \"87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea\": container with ID starting with 
87b4f909944c574ab1d1c527297f669d92f5f263922530b6121d70e248bd13ea not found: ID does not exist" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.176535 4764 scope.go:117] "RemoveContainer" containerID="a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51" Dec 04 00:39:41 crc kubenswrapper[4764]: E1204 00:39:41.176968 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51\": container with ID starting with a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51 not found: ID does not exist" containerID="a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.176997 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51"} err="failed to get container status \"a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51\": rpc error: code = NotFound desc = could not find container \"a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51\": container with ID starting with a9ac13601da0fb2de6b1a6e95aa54e1c1ecd4ba0153851a09b70c38e9e750b51 not found: ID does not exist" Dec 04 00:39:41 crc kubenswrapper[4764]: I1204 00:39:41.177016 4764 scope.go:117] "RemoveContainer" containerID="afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204" Dec 04 00:39:41 crc kubenswrapper[4764]: E1204 00:39:41.177313 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204\": container with ID starting with afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204 not found: ID does not exist" containerID="afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204" Dec 04 00:39:41 crc 
kubenswrapper[4764]: I1204 00:39:41.177337 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204"} err="failed to get container status \"afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204\": rpc error: code = NotFound desc = could not find container \"afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204\": container with ID starting with afd33e85394b70ad5150364807bb8b26ff6b3013a36065a911ca4c9419cdf204 not found: ID does not exist" Dec 04 00:39:42 crc kubenswrapper[4764]: I1204 00:39:42.559526 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4918c06-8304-4564-b676-8de26331ce48" path="/var/lib/kubelet/pods/c4918c06-8304-4564-b676-8de26331ce48/volumes" Dec 04 00:40:50 crc kubenswrapper[4764]: I1204 00:40:50.868817 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:40:50 crc kubenswrapper[4764]: I1204 00:40:50.869411 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:41:20 crc kubenswrapper[4764]: I1204 00:41:20.868670 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:41:20 crc kubenswrapper[4764]: I1204 00:41:20.869276 4764 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:41:50 crc kubenswrapper[4764]: I1204 00:41:50.869075 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:41:50 crc kubenswrapper[4764]: I1204 00:41:50.869806 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:41:50 crc kubenswrapper[4764]: I1204 00:41:50.869884 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:41:50 crc kubenswrapper[4764]: I1204 00:41:50.870870 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36b011b29c1a112ebb21c7ea247458c1d61fbd59347e10f58cd77f84ee48e337"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:41:50 crc kubenswrapper[4764]: I1204 00:41:50.871012 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" 
containerName="machine-config-daemon" containerID="cri-o://36b011b29c1a112ebb21c7ea247458c1d61fbd59347e10f58cd77f84ee48e337" gracePeriod=600 Dec 04 00:41:51 crc kubenswrapper[4764]: I1204 00:41:51.203852 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="36b011b29c1a112ebb21c7ea247458c1d61fbd59347e10f58cd77f84ee48e337" exitCode=0 Dec 04 00:41:51 crc kubenswrapper[4764]: I1204 00:41:51.203964 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"36b011b29c1a112ebb21c7ea247458c1d61fbd59347e10f58cd77f84ee48e337"} Dec 04 00:41:51 crc kubenswrapper[4764]: I1204 00:41:51.204130 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f"} Dec 04 00:41:51 crc kubenswrapper[4764]: I1204 00:41:51.204150 4764 scope.go:117] "RemoveContainer" containerID="3ba3cee0977a8c679df2f1d1f76849f4d4bdf8207a93d04815ebfdee1705c4c6" Dec 04 00:44:20 crc kubenswrapper[4764]: I1204 00:44:20.868377 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:44:20 crc kubenswrapper[4764]: I1204 00:44:20.869039 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 
04 00:44:50 crc kubenswrapper[4764]: I1204 00:44:50.869526 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:44:50 crc kubenswrapper[4764]: I1204 00:44:50.870183 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.183429 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7"] Dec 04 00:45:00 crc kubenswrapper[4764]: E1204 00:45:00.184131 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="registry-server" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.184142 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="registry-server" Dec 04 00:45:00 crc kubenswrapper[4764]: E1204 00:45:00.184156 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="extract-utilities" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.184163 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="extract-utilities" Dec 04 00:45:00 crc kubenswrapper[4764]: E1204 00:45:00.184176 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="extract-content" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.184182 
4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="extract-content" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.184312 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4918c06-8304-4564-b676-8de26331ce48" containerName="registry-server" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.201394 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7"] Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.201794 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.205969 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.207626 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.209310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54cc2674-47e4-4b68-b220-1a68ada9eba8-config-volume\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.209348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54cc2674-47e4-4b68-b220-1a68ada9eba8-secret-volume\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.209380 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk4hc\" (UniqueName: \"kubernetes.io/projected/54cc2674-47e4-4b68-b220-1a68ada9eba8-kube-api-access-sk4hc\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.310500 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54cc2674-47e4-4b68-b220-1a68ada9eba8-config-volume\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.310556 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54cc2674-47e4-4b68-b220-1a68ada9eba8-secret-volume\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.310618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk4hc\" (UniqueName: \"kubernetes.io/projected/54cc2674-47e4-4b68-b220-1a68ada9eba8-kube-api-access-sk4hc\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.311545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/54cc2674-47e4-4b68-b220-1a68ada9eba8-config-volume\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.316103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54cc2674-47e4-4b68-b220-1a68ada9eba8-secret-volume\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.326307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk4hc\" (UniqueName: \"kubernetes.io/projected/54cc2674-47e4-4b68-b220-1a68ada9eba8-kube-api-access-sk4hc\") pod \"collect-profiles-29413485-fvkm7\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.524628 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:00 crc kubenswrapper[4764]: I1204 00:45:00.975345 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7"] Dec 04 00:45:01 crc kubenswrapper[4764]: I1204 00:45:01.879951 4764 generic.go:334] "Generic (PLEG): container finished" podID="54cc2674-47e4-4b68-b220-1a68ada9eba8" containerID="1f17cd099233467c2a8b412f29eb98c130b798b5362256d3af61562971672cef" exitCode=0 Dec 04 00:45:01 crc kubenswrapper[4764]: I1204 00:45:01.880015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" event={"ID":"54cc2674-47e4-4b68-b220-1a68ada9eba8","Type":"ContainerDied","Data":"1f17cd099233467c2a8b412f29eb98c130b798b5362256d3af61562971672cef"} Dec 04 00:45:01 crc kubenswrapper[4764]: I1204 00:45:01.880283 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" event={"ID":"54cc2674-47e4-4b68-b220-1a68ada9eba8","Type":"ContainerStarted","Data":"15d9b36b24d336313cea85cc8b35283d8886ae857d696afe14bf659a2cc02db5"} Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.197381 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.353100 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54cc2674-47e4-4b68-b220-1a68ada9eba8-config-volume\") pod \"54cc2674-47e4-4b68-b220-1a68ada9eba8\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.353595 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54cc2674-47e4-4b68-b220-1a68ada9eba8-secret-volume\") pod \"54cc2674-47e4-4b68-b220-1a68ada9eba8\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.353966 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54cc2674-47e4-4b68-b220-1a68ada9eba8-config-volume" (OuterVolumeSpecName: "config-volume") pod "54cc2674-47e4-4b68-b220-1a68ada9eba8" (UID: "54cc2674-47e4-4b68-b220-1a68ada9eba8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.354958 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk4hc\" (UniqueName: \"kubernetes.io/projected/54cc2674-47e4-4b68-b220-1a68ada9eba8-kube-api-access-sk4hc\") pod \"54cc2674-47e4-4b68-b220-1a68ada9eba8\" (UID: \"54cc2674-47e4-4b68-b220-1a68ada9eba8\") " Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.355579 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54cc2674-47e4-4b68-b220-1a68ada9eba8-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.362690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cc2674-47e4-4b68-b220-1a68ada9eba8-kube-api-access-sk4hc" (OuterVolumeSpecName: "kube-api-access-sk4hc") pod "54cc2674-47e4-4b68-b220-1a68ada9eba8" (UID: "54cc2674-47e4-4b68-b220-1a68ada9eba8"). InnerVolumeSpecName "kube-api-access-sk4hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.367939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cc2674-47e4-4b68-b220-1a68ada9eba8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54cc2674-47e4-4b68-b220-1a68ada9eba8" (UID: "54cc2674-47e4-4b68-b220-1a68ada9eba8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.456629 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54cc2674-47e4-4b68-b220-1a68ada9eba8-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.456656 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk4hc\" (UniqueName: \"kubernetes.io/projected/54cc2674-47e4-4b68-b220-1a68ada9eba8-kube-api-access-sk4hc\") on node \"crc\" DevicePath \"\"" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.899063 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" event={"ID":"54cc2674-47e4-4b68-b220-1a68ada9eba8","Type":"ContainerDied","Data":"15d9b36b24d336313cea85cc8b35283d8886ae857d696afe14bf659a2cc02db5"} Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.899101 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15d9b36b24d336313cea85cc8b35283d8886ae857d696afe14bf659a2cc02db5" Dec 04 00:45:03 crc kubenswrapper[4764]: I1204 00:45:03.899160 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.271156 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9"] Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.277021 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413440-cjgm9"] Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.580136 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6683d2-f59f-42d2-8666-8960bce251af" path="/var/lib/kubelet/pods/cc6683d2-f59f-42d2-8666-8960bce251af/volumes" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.723224 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4drqq"] Dec 04 00:45:04 crc kubenswrapper[4764]: E1204 00:45:04.723656 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cc2674-47e4-4b68-b220-1a68ada9eba8" containerName="collect-profiles" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.723681 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cc2674-47e4-4b68-b220-1a68ada9eba8" containerName="collect-profiles" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.723962 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="54cc2674-47e4-4b68-b220-1a68ada9eba8" containerName="collect-profiles" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.730605 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.756991 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4drqq"] Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.885278 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-utilities\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.885902 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7nws\" (UniqueName: \"kubernetes.io/projected/3d756ba6-598c-40dd-976e-5e55914e93c7-kube-api-access-c7nws\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.886050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-catalog-content\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.987052 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7nws\" (UniqueName: \"kubernetes.io/projected/3d756ba6-598c-40dd-976e-5e55914e93c7-kube-api-access-c7nws\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.987384 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-catalog-content\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.987422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-utilities\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.987886 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-utilities\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:04 crc kubenswrapper[4764]: I1204 00:45:04.988149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-catalog-content\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.005957 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7nws\" (UniqueName: \"kubernetes.io/projected/3d756ba6-598c-40dd-976e-5e55914e93c7-kube-api-access-c7nws\") pod \"redhat-operators-4drqq\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.056821 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:05 crc kubenswrapper[4764]: W1204 00:45:05.477796 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d756ba6_598c_40dd_976e_5e55914e93c7.slice/crio-313d8815bb1add99da0de797472f575efc3514b096404238a0a693a11b6fd1a0 WatchSource:0}: Error finding container 313d8815bb1add99da0de797472f575efc3514b096404238a0a693a11b6fd1a0: Status 404 returned error can't find the container with id 313d8815bb1add99da0de797472f575efc3514b096404238a0a693a11b6fd1a0 Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.479578 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4drqq"] Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.916126 4764 generic.go:334] "Generic (PLEG): container finished" podID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerID="464c4e18b5bd62a452e132a36f275fe3df47eae76ed1ac7cf20dd97cddb927b2" exitCode=0 Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.916172 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerDied","Data":"464c4e18b5bd62a452e132a36f275fe3df47eae76ed1ac7cf20dd97cddb927b2"} Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.916196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerStarted","Data":"313d8815bb1add99da0de797472f575efc3514b096404238a0a693a11b6fd1a0"} Dec 04 00:45:05 crc kubenswrapper[4764]: I1204 00:45:05.919153 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 00:45:06 crc kubenswrapper[4764]: I1204 00:45:06.929101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerStarted","Data":"9cc339e45017adc4e9be1a82cd0900527dae5f931b4c5ad14d9372e64197bbff"} Dec 04 00:45:07 crc kubenswrapper[4764]: I1204 00:45:07.942414 4764 generic.go:334] "Generic (PLEG): container finished" podID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerID="9cc339e45017adc4e9be1a82cd0900527dae5f931b4c5ad14d9372e64197bbff" exitCode=0 Dec 04 00:45:07 crc kubenswrapper[4764]: I1204 00:45:07.943439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerDied","Data":"9cc339e45017adc4e9be1a82cd0900527dae5f931b4c5ad14d9372e64197bbff"} Dec 04 00:45:08 crc kubenswrapper[4764]: I1204 00:45:08.955511 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerStarted","Data":"fbb1970556a2f187a2cdda4be79e4b03d4085baae28a76996f1b8c8678285e0f"} Dec 04 00:45:08 crc kubenswrapper[4764]: I1204 00:45:08.983323 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4drqq" podStartSLOduration=2.477546858 podStartE2EDuration="4.98329472s" podCreationTimestamp="2025-12-04 00:45:04 +0000 UTC" firstStartedPulling="2025-12-04 00:45:05.918934655 +0000 UTC m=+3841.680259066" lastFinishedPulling="2025-12-04 00:45:08.424682517 +0000 UTC m=+3844.186006928" observedRunningTime="2025-12-04 00:45:08.973987987 +0000 UTC m=+3844.735312438" watchObservedRunningTime="2025-12-04 00:45:08.98329472 +0000 UTC m=+3844.744619151" Dec 04 00:45:09 crc kubenswrapper[4764]: I1204 00:45:09.618945 4764 scope.go:117] "RemoveContainer" containerID="016a7a62e13733816d8435cb64f114dba1e801e641a5f17e88eaee4b21fab7e7" Dec 04 00:45:15 crc kubenswrapper[4764]: I1204 00:45:15.057904 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:15 crc kubenswrapper[4764]: I1204 00:45:15.058377 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:15 crc kubenswrapper[4764]: I1204 00:45:15.099701 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:16 crc kubenswrapper[4764]: I1204 00:45:16.110044 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:16 crc kubenswrapper[4764]: I1204 00:45:16.157153 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4drqq"] Dec 04 00:45:18 crc kubenswrapper[4764]: I1204 00:45:18.018256 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4drqq" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="registry-server" containerID="cri-o://fbb1970556a2f187a2cdda4be79e4b03d4085baae28a76996f1b8c8678285e0f" gracePeriod=2 Dec 04 00:45:20 crc kubenswrapper[4764]: I1204 00:45:20.869349 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:45:20 crc kubenswrapper[4764]: I1204 00:45:20.869703 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:45:20 crc kubenswrapper[4764]: I1204 
00:45:20.869783 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:45:20 crc kubenswrapper[4764]: I1204 00:45:20.870531 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:45:20 crc kubenswrapper[4764]: I1204 00:45:20.870603 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" gracePeriod=600 Dec 04 00:45:21 crc kubenswrapper[4764]: E1204 00:45:21.021817 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.044854 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" exitCode=0 Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.044943 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f"} Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.045008 4764 scope.go:117] "RemoveContainer" containerID="36b011b29c1a112ebb21c7ea247458c1d61fbd59347e10f58cd77f84ee48e337" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.045833 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:45:21 crc kubenswrapper[4764]: E1204 00:45:21.046163 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.051234 4764 generic.go:334] "Generic (PLEG): container finished" podID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerID="fbb1970556a2f187a2cdda4be79e4b03d4085baae28a76996f1b8c8678285e0f" exitCode=0 Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.051290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerDied","Data":"fbb1970556a2f187a2cdda4be79e4b03d4085baae28a76996f1b8c8678285e0f"} Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.233749 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.323086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-catalog-content\") pod \"3d756ba6-598c-40dd-976e-5e55914e93c7\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.323147 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-utilities\") pod \"3d756ba6-598c-40dd-976e-5e55914e93c7\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.323174 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7nws\" (UniqueName: \"kubernetes.io/projected/3d756ba6-598c-40dd-976e-5e55914e93c7-kube-api-access-c7nws\") pod \"3d756ba6-598c-40dd-976e-5e55914e93c7\" (UID: \"3d756ba6-598c-40dd-976e-5e55914e93c7\") " Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.324433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-utilities" (OuterVolumeSpecName: "utilities") pod "3d756ba6-598c-40dd-976e-5e55914e93c7" (UID: "3d756ba6-598c-40dd-976e-5e55914e93c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.327984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d756ba6-598c-40dd-976e-5e55914e93c7-kube-api-access-c7nws" (OuterVolumeSpecName: "kube-api-access-c7nws") pod "3d756ba6-598c-40dd-976e-5e55914e93c7" (UID: "3d756ba6-598c-40dd-976e-5e55914e93c7"). InnerVolumeSpecName "kube-api-access-c7nws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.424955 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.424996 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7nws\" (UniqueName: \"kubernetes.io/projected/3d756ba6-598c-40dd-976e-5e55914e93c7-kube-api-access-c7nws\") on node \"crc\" DevicePath \"\"" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.439870 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d756ba6-598c-40dd-976e-5e55914e93c7" (UID: "3d756ba6-598c-40dd-976e-5e55914e93c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:45:21 crc kubenswrapper[4764]: I1204 00:45:21.526070 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d756ba6-598c-40dd-976e-5e55914e93c7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.062657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4drqq" event={"ID":"3d756ba6-598c-40dd-976e-5e55914e93c7","Type":"ContainerDied","Data":"313d8815bb1add99da0de797472f575efc3514b096404238a0a693a11b6fd1a0"} Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.063522 4764 scope.go:117] "RemoveContainer" containerID="fbb1970556a2f187a2cdda4be79e4b03d4085baae28a76996f1b8c8678285e0f" Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.062752 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4drqq" Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.084078 4764 scope.go:117] "RemoveContainer" containerID="9cc339e45017adc4e9be1a82cd0900527dae5f931b4c5ad14d9372e64197bbff" Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.093460 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4drqq"] Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.102961 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4drqq"] Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.117027 4764 scope.go:117] "RemoveContainer" containerID="464c4e18b5bd62a452e132a36f275fe3df47eae76ed1ac7cf20dd97cddb927b2" Dec 04 00:45:22 crc kubenswrapper[4764]: I1204 00:45:22.556668 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" path="/var/lib/kubelet/pods/3d756ba6-598c-40dd-976e-5e55914e93c7/volumes" Dec 04 00:45:31 crc kubenswrapper[4764]: I1204 00:45:31.546237 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:45:31 crc kubenswrapper[4764]: E1204 00:45:31.547007 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:45:46 crc kubenswrapper[4764]: I1204 00:45:46.546570 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:45:46 crc kubenswrapper[4764]: E1204 00:45:46.547821 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:45:59 crc kubenswrapper[4764]: I1204 00:45:59.546122 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:45:59 crc kubenswrapper[4764]: E1204 00:45:59.549104 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.701865 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gqnql"] Dec 04 00:46:05 crc kubenswrapper[4764]: E1204 00:46:05.707182 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="registry-server" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.707225 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="registry-server" Dec 04 00:46:05 crc kubenswrapper[4764]: E1204 00:46:05.707252 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="extract-content" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.707266 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" 
containerName="extract-content" Dec 04 00:46:05 crc kubenswrapper[4764]: E1204 00:46:05.707286 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="extract-utilities" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.707300 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="extract-utilities" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.707624 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d756ba6-598c-40dd-976e-5e55914e93c7" containerName="registry-server" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.709931 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.722808 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqnql"] Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.816114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2hf\" (UniqueName: \"kubernetes.io/projected/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-kube-api-access-tr2hf\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.816426 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-utilities\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.816565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-catalog-content\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.917974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-utilities\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.918260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-catalog-content\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.918420 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr2hf\" (UniqueName: \"kubernetes.io/projected/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-kube-api-access-tr2hf\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.918535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-utilities\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.919036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-catalog-content\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:05 crc kubenswrapper[4764]: I1204 00:46:05.950233 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr2hf\" (UniqueName: \"kubernetes.io/projected/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-kube-api-access-tr2hf\") pod \"community-operators-gqnql\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:06 crc kubenswrapper[4764]: I1204 00:46:06.046219 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:06 crc kubenswrapper[4764]: I1204 00:46:06.570357 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gqnql"] Dec 04 00:46:07 crc kubenswrapper[4764]: I1204 00:46:07.462821 4764 generic.go:334] "Generic (PLEG): container finished" podID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerID="b45eaca8943e6d48fa2bfefecd802215e2425b0c6302fa03b20b2b3b3881261f" exitCode=0 Dec 04 00:46:07 crc kubenswrapper[4764]: I1204 00:46:07.462868 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqnql" event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerDied","Data":"b45eaca8943e6d48fa2bfefecd802215e2425b0c6302fa03b20b2b3b3881261f"} Dec 04 00:46:07 crc kubenswrapper[4764]: I1204 00:46:07.463206 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqnql" event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerStarted","Data":"6d86e369ee99412ed350b61d312d2257232fcbb01420532d828798b493611a6c"} Dec 04 00:46:08 crc kubenswrapper[4764]: I1204 00:46:08.474447 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-gqnql" event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerStarted","Data":"5a313842266469d8b03512c467e557253b150e7c31624deed7f89b8c000d797b"} Dec 04 00:46:09 crc kubenswrapper[4764]: I1204 00:46:09.488527 4764 generic.go:334] "Generic (PLEG): container finished" podID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerID="5a313842266469d8b03512c467e557253b150e7c31624deed7f89b8c000d797b" exitCode=0 Dec 04 00:46:09 crc kubenswrapper[4764]: I1204 00:46:09.488631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqnql" event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerDied","Data":"5a313842266469d8b03512c467e557253b150e7c31624deed7f89b8c000d797b"} Dec 04 00:46:10 crc kubenswrapper[4764]: I1204 00:46:10.500622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqnql" event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerStarted","Data":"862a38ee305bdce59da64e50875d46f05b28c034f508df17e0825d468363c85f"} Dec 04 00:46:10 crc kubenswrapper[4764]: I1204 00:46:10.529812 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gqnql" podStartSLOduration=3.020591623 podStartE2EDuration="5.52978984s" podCreationTimestamp="2025-12-04 00:46:05 +0000 UTC" firstStartedPulling="2025-12-04 00:46:07.464854871 +0000 UTC m=+3903.226179282" lastFinishedPulling="2025-12-04 00:46:09.974053078 +0000 UTC m=+3905.735377499" observedRunningTime="2025-12-04 00:46:10.526543429 +0000 UTC m=+3906.287867870" watchObservedRunningTime="2025-12-04 00:46:10.52978984 +0000 UTC m=+3906.291114491" Dec 04 00:46:11 crc kubenswrapper[4764]: I1204 00:46:11.546279 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:46:11 crc kubenswrapper[4764]: E1204 00:46:11.547280 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:46:16 crc kubenswrapper[4764]: I1204 00:46:16.046882 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:16 crc kubenswrapper[4764]: I1204 00:46:16.047266 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:16 crc kubenswrapper[4764]: I1204 00:46:16.094360 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:16 crc kubenswrapper[4764]: I1204 00:46:16.590024 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:16 crc kubenswrapper[4764]: I1204 00:46:16.641307 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqnql"] Dec 04 00:46:18 crc kubenswrapper[4764]: I1204 00:46:18.561753 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gqnql" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="registry-server" containerID="cri-o://862a38ee305bdce59da64e50875d46f05b28c034f508df17e0825d468363c85f" gracePeriod=2 Dec 04 00:46:19 crc kubenswrapper[4764]: I1204 00:46:19.578225 4764 generic.go:334] "Generic (PLEG): container finished" podID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerID="862a38ee305bdce59da64e50875d46f05b28c034f508df17e0825d468363c85f" exitCode=0 Dec 04 
00:46:19 crc kubenswrapper[4764]: I1204 00:46:19.578292 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqnql" event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerDied","Data":"862a38ee305bdce59da64e50875d46f05b28c034f508df17e0825d468363c85f"} Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.137080 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.247203 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr2hf\" (UniqueName: \"kubernetes.io/projected/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-kube-api-access-tr2hf\") pod \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.247287 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-catalog-content\") pod \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.247346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-utilities\") pod \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\" (UID: \"2e3510e8-b15b-4f66-9e98-7f451f1fdf14\") " Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.248853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-utilities" (OuterVolumeSpecName: "utilities") pod "2e3510e8-b15b-4f66-9e98-7f451f1fdf14" (UID: "2e3510e8-b15b-4f66-9e98-7f451f1fdf14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.259926 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-kube-api-access-tr2hf" (OuterVolumeSpecName: "kube-api-access-tr2hf") pod "2e3510e8-b15b-4f66-9e98-7f451f1fdf14" (UID: "2e3510e8-b15b-4f66-9e98-7f451f1fdf14"). InnerVolumeSpecName "kube-api-access-tr2hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.304299 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e3510e8-b15b-4f66-9e98-7f451f1fdf14" (UID: "2e3510e8-b15b-4f66-9e98-7f451f1fdf14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.348993 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.349033 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr2hf\" (UniqueName: \"kubernetes.io/projected/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-kube-api-access-tr2hf\") on node \"crc\" DevicePath \"\"" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.349048 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e3510e8-b15b-4f66-9e98-7f451f1fdf14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.589294 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gqnql" 
event={"ID":"2e3510e8-b15b-4f66-9e98-7f451f1fdf14","Type":"ContainerDied","Data":"6d86e369ee99412ed350b61d312d2257232fcbb01420532d828798b493611a6c"} Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.589358 4764 scope.go:117] "RemoveContainer" containerID="862a38ee305bdce59da64e50875d46f05b28c034f508df17e0825d468363c85f" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.589500 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gqnql" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.627045 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gqnql"] Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.634277 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gqnql"] Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.645688 4764 scope.go:117] "RemoveContainer" containerID="5a313842266469d8b03512c467e557253b150e7c31624deed7f89b8c000d797b" Dec 04 00:46:20 crc kubenswrapper[4764]: I1204 00:46:20.670977 4764 scope.go:117] "RemoveContainer" containerID="b45eaca8943e6d48fa2bfefecd802215e2425b0c6302fa03b20b2b3b3881261f" Dec 04 00:46:22 crc kubenswrapper[4764]: I1204 00:46:22.554621 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" path="/var/lib/kubelet/pods/2e3510e8-b15b-4f66-9e98-7f451f1fdf14/volumes" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.474515 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kdp2p"] Dec 04 00:46:24 crc kubenswrapper[4764]: E1204 00:46:24.475107 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="registry-server" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.475120 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="registry-server" Dec 04 00:46:24 crc kubenswrapper[4764]: E1204 00:46:24.475131 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="extract-utilities" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.475136 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="extract-utilities" Dec 04 00:46:24 crc kubenswrapper[4764]: E1204 00:46:24.475144 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="extract-content" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.475150 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="extract-content" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.475307 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e3510e8-b15b-4f66-9e98-7f451f1fdf14" containerName="registry-server" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.476288 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.490387 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdp2p"] Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.637525 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-catalog-content\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.637610 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-utilities\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.637981 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zgz\" (UniqueName: \"kubernetes.io/projected/bbcb89d7-e51a-49b4-910c-5c230f66c21e-kube-api-access-q5zgz\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.740013 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zgz\" (UniqueName: \"kubernetes.io/projected/bbcb89d7-e51a-49b4-910c-5c230f66c21e-kube-api-access-q5zgz\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.740124 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-catalog-content\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.740161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-utilities\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.740829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-utilities\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.741030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-catalog-content\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.772064 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zgz\" (UniqueName: \"kubernetes.io/projected/bbcb89d7-e51a-49b4-910c-5c230f66c21e-kube-api-access-q5zgz\") pod \"certified-operators-kdp2p\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:24 crc kubenswrapper[4764]: I1204 00:46:24.820204 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:25 crc kubenswrapper[4764]: I1204 00:46:25.079232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kdp2p"] Dec 04 00:46:25 crc kubenswrapper[4764]: I1204 00:46:25.545336 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:46:25 crc kubenswrapper[4764]: E1204 00:46:25.545908 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:46:25 crc kubenswrapper[4764]: I1204 00:46:25.638960 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerID="38b19042bfcc07b3bf76ba581e528bf4ede6900e5160d781fafc4d4d5ba5c88a" exitCode=0 Dec 04 00:46:25 crc kubenswrapper[4764]: I1204 00:46:25.639021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdp2p" event={"ID":"bbcb89d7-e51a-49b4-910c-5c230f66c21e","Type":"ContainerDied","Data":"38b19042bfcc07b3bf76ba581e528bf4ede6900e5160d781fafc4d4d5ba5c88a"} Dec 04 00:46:25 crc kubenswrapper[4764]: I1204 00:46:25.639061 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdp2p" event={"ID":"bbcb89d7-e51a-49b4-910c-5c230f66c21e","Type":"ContainerStarted","Data":"b1baa739af6bf15adde234aca8a65c7a96b8ff5e970891c6fe8ba0196b69f400"} Dec 04 00:46:26 crc kubenswrapper[4764]: I1204 00:46:26.650228 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" 
containerID="9723db4c7ce2712a078beb5976ed668f11b93ef73157f4b8c4af9560105fa7b6" exitCode=0 Dec 04 00:46:26 crc kubenswrapper[4764]: I1204 00:46:26.650562 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdp2p" event={"ID":"bbcb89d7-e51a-49b4-910c-5c230f66c21e","Type":"ContainerDied","Data":"9723db4c7ce2712a078beb5976ed668f11b93ef73157f4b8c4af9560105fa7b6"} Dec 04 00:46:27 crc kubenswrapper[4764]: I1204 00:46:27.659354 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdp2p" event={"ID":"bbcb89d7-e51a-49b4-910c-5c230f66c21e","Type":"ContainerStarted","Data":"212f2afd76bccda9bf341353ce3eb62be6dc73698cae2494ab96f4bd69121f4e"} Dec 04 00:46:27 crc kubenswrapper[4764]: I1204 00:46:27.683792 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kdp2p" podStartSLOduration=2.269629555 podStartE2EDuration="3.683770499s" podCreationTimestamp="2025-12-04 00:46:24 +0000 UTC" firstStartedPulling="2025-12-04 00:46:25.641372723 +0000 UTC m=+3921.402697174" lastFinishedPulling="2025-12-04 00:46:27.055513707 +0000 UTC m=+3922.816838118" observedRunningTime="2025-12-04 00:46:27.680317393 +0000 UTC m=+3923.441641804" watchObservedRunningTime="2025-12-04 00:46:27.683770499 +0000 UTC m=+3923.445094910" Dec 04 00:46:34 crc kubenswrapper[4764]: I1204 00:46:34.821294 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:34 crc kubenswrapper[4764]: I1204 00:46:34.822148 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:34 crc kubenswrapper[4764]: I1204 00:46:34.898610 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:35 crc kubenswrapper[4764]: I1204 
00:46:35.807419 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:36 crc kubenswrapper[4764]: I1204 00:46:36.139981 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdp2p"] Dec 04 00:46:36 crc kubenswrapper[4764]: I1204 00:46:36.545740 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:46:36 crc kubenswrapper[4764]: E1204 00:46:36.545949 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:46:37 crc kubenswrapper[4764]: I1204 00:46:37.735242 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kdp2p" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="registry-server" containerID="cri-o://212f2afd76bccda9bf341353ce3eb62be6dc73698cae2494ab96f4bd69121f4e" gracePeriod=2 Dec 04 00:46:38 crc kubenswrapper[4764]: I1204 00:46:38.748482 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerID="212f2afd76bccda9bf341353ce3eb62be6dc73698cae2494ab96f4bd69121f4e" exitCode=0 Dec 04 00:46:38 crc kubenswrapper[4764]: I1204 00:46:38.748930 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdp2p" event={"ID":"bbcb89d7-e51a-49b4-910c-5c230f66c21e","Type":"ContainerDied","Data":"212f2afd76bccda9bf341353ce3eb62be6dc73698cae2494ab96f4bd69121f4e"} Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.481637 
4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.614821 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-catalog-content\") pod \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.614874 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5zgz\" (UniqueName: \"kubernetes.io/projected/bbcb89d7-e51a-49b4-910c-5c230f66c21e-kube-api-access-q5zgz\") pod \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.615461 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-utilities\") pod \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\" (UID: \"bbcb89d7-e51a-49b4-910c-5c230f66c21e\") " Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.616413 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-utilities" (OuterVolumeSpecName: "utilities") pod "bbcb89d7-e51a-49b4-910c-5c230f66c21e" (UID: "bbcb89d7-e51a-49b4-910c-5c230f66c21e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.623138 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbcb89d7-e51a-49b4-910c-5c230f66c21e-kube-api-access-q5zgz" (OuterVolumeSpecName: "kube-api-access-q5zgz") pod "bbcb89d7-e51a-49b4-910c-5c230f66c21e" (UID: "bbcb89d7-e51a-49b4-910c-5c230f66c21e"). 
InnerVolumeSpecName "kube-api-access-q5zgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.687907 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbcb89d7-e51a-49b4-910c-5c230f66c21e" (UID: "bbcb89d7-e51a-49b4-910c-5c230f66c21e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.717303 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.717340 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbcb89d7-e51a-49b4-910c-5c230f66c21e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.717355 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5zgz\" (UniqueName: \"kubernetes.io/projected/bbcb89d7-e51a-49b4-910c-5c230f66c21e-kube-api-access-q5zgz\") on node \"crc\" DevicePath \"\"" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.757965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kdp2p" event={"ID":"bbcb89d7-e51a-49b4-910c-5c230f66c21e","Type":"ContainerDied","Data":"b1baa739af6bf15adde234aca8a65c7a96b8ff5e970891c6fe8ba0196b69f400"} Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.758035 4764 scope.go:117] "RemoveContainer" containerID="212f2afd76bccda9bf341353ce3eb62be6dc73698cae2494ab96f4bd69121f4e" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.758032 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kdp2p" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.791149 4764 scope.go:117] "RemoveContainer" containerID="9723db4c7ce2712a078beb5976ed668f11b93ef73157f4b8c4af9560105fa7b6" Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.798972 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kdp2p"] Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.807607 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kdp2p"] Dec 04 00:46:39 crc kubenswrapper[4764]: I1204 00:46:39.821340 4764 scope.go:117] "RemoveContainer" containerID="38b19042bfcc07b3bf76ba581e528bf4ede6900e5160d781fafc4d4d5ba5c88a" Dec 04 00:46:40 crc kubenswrapper[4764]: I1204 00:46:40.557984 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" path="/var/lib/kubelet/pods/bbcb89d7-e51a-49b4-910c-5c230f66c21e/volumes" Dec 04 00:46:47 crc kubenswrapper[4764]: I1204 00:46:47.546093 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:46:47 crc kubenswrapper[4764]: E1204 00:46:47.549222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:46:58 crc kubenswrapper[4764]: I1204 00:46:58.546005 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:46:58 crc kubenswrapper[4764]: E1204 00:46:58.546950 4764 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:47:11 crc kubenswrapper[4764]: I1204 00:47:11.546458 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:47:11 crc kubenswrapper[4764]: E1204 00:47:11.547620 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:47:23 crc kubenswrapper[4764]: I1204 00:47:23.545823 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:47:23 crc kubenswrapper[4764]: E1204 00:47:23.546508 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:47:34 crc kubenswrapper[4764]: I1204 00:47:34.556588 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:47:34 crc kubenswrapper[4764]: E1204 00:47:34.557875 4764 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:47:45 crc kubenswrapper[4764]: I1204 00:47:45.545931 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:47:45 crc kubenswrapper[4764]: E1204 00:47:45.546897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:48:00 crc kubenswrapper[4764]: I1204 00:48:00.546995 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:48:00 crc kubenswrapper[4764]: E1204 00:48:00.548065 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:48:13 crc kubenswrapper[4764]: I1204 00:48:13.546856 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:48:13 crc kubenswrapper[4764]: E1204 00:48:13.547947 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:48:24 crc kubenswrapper[4764]: I1204 00:48:24.551543 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:48:24 crc kubenswrapper[4764]: E1204 00:48:24.552677 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:48:38 crc kubenswrapper[4764]: I1204 00:48:38.546174 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:48:38 crc kubenswrapper[4764]: E1204 00:48:38.547261 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:48:49 crc kubenswrapper[4764]: I1204 00:48:49.545995 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:48:49 crc kubenswrapper[4764]: E1204 
00:48:49.546915 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:49:03 crc kubenswrapper[4764]: I1204 00:49:03.545894 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:49:03 crc kubenswrapper[4764]: E1204 00:49:03.546686 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:49:15 crc kubenswrapper[4764]: I1204 00:49:15.546469 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:49:15 crc kubenswrapper[4764]: E1204 00:49:15.547522 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:49:27 crc kubenswrapper[4764]: I1204 00:49:27.545445 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:49:27 crc 
kubenswrapper[4764]: E1204 00:49:27.546171 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:49:39 crc kubenswrapper[4764]: I1204 00:49:39.546571 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:49:39 crc kubenswrapper[4764]: E1204 00:49:39.547929 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:49:53 crc kubenswrapper[4764]: I1204 00:49:53.546240 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:49:53 crc kubenswrapper[4764]: E1204 00:49:53.547353 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:50:04 crc kubenswrapper[4764]: I1204 00:50:04.550297 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 
04 00:50:04 crc kubenswrapper[4764]: E1204 00:50:04.551182 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:50:17 crc kubenswrapper[4764]: I1204 00:50:17.546000 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:50:17 crc kubenswrapper[4764]: E1204 00:50:17.547199 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:50:31 crc kubenswrapper[4764]: I1204 00:50:31.545588 4764 scope.go:117] "RemoveContainer" containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:50:31 crc kubenswrapper[4764]: I1204 00:50:31.782860 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"3142c9df3f9a8748567ccbbdeae293cfb794b8a558723085d6b857f15dc9fb59"} Dec 04 00:52:50 crc kubenswrapper[4764]: I1204 00:52:50.868531 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 04 00:52:50 crc kubenswrapper[4764]: I1204 00:52:50.869071 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:53:20 crc kubenswrapper[4764]: I1204 00:53:20.868867 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:53:20 crc kubenswrapper[4764]: I1204 00:53:20.869543 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:53:50 crc kubenswrapper[4764]: I1204 00:53:50.869402 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:53:50 crc kubenswrapper[4764]: I1204 00:53:50.870061 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:53:50 crc kubenswrapper[4764]: I1204 00:53:50.870129 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:53:50 crc kubenswrapper[4764]: I1204 00:53:50.871037 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3142c9df3f9a8748567ccbbdeae293cfb794b8a558723085d6b857f15dc9fb59"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:53:50 crc kubenswrapper[4764]: I1204 00:53:50.871136 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://3142c9df3f9a8748567ccbbdeae293cfb794b8a558723085d6b857f15dc9fb59" gracePeriod=600 Dec 04 00:53:51 crc kubenswrapper[4764]: I1204 00:53:51.847015 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="3142c9df3f9a8748567ccbbdeae293cfb794b8a558723085d6b857f15dc9fb59" exitCode=0 Dec 04 00:53:51 crc kubenswrapper[4764]: I1204 00:53:51.847073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"3142c9df3f9a8748567ccbbdeae293cfb794b8a558723085d6b857f15dc9fb59"} Dec 04 00:53:51 crc kubenswrapper[4764]: I1204 00:53:51.847617 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2"} Dec 04 00:53:51 crc kubenswrapper[4764]: I1204 00:53:51.847638 4764 scope.go:117] "RemoveContainer" 
containerID="04ad706c6e96c7087df3e75b195a59ed80180bf453251a823acea14c09065a8f" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.776920 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gq4fn"] Dec 04 00:55:30 crc kubenswrapper[4764]: E1204 00:55:30.778641 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="registry-server" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.778667 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="registry-server" Dec 04 00:55:30 crc kubenswrapper[4764]: E1204 00:55:30.778708 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="extract-utilities" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.778753 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="extract-utilities" Dec 04 00:55:30 crc kubenswrapper[4764]: E1204 00:55:30.778791 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="extract-content" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.778808 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="extract-content" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.779125 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbcb89d7-e51a-49b4-910c-5c230f66c21e" containerName="registry-server" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.781212 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.805864 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq4fn"] Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.913058 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-utilities\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.913108 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-catalog-content\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:30 crc kubenswrapper[4764]: I1204 00:55:30.913150 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgg2l\" (UniqueName: \"kubernetes.io/projected/c9506196-576e-414f-9b59-82cdd7123466-kube-api-access-fgg2l\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.014841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-utilities\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.014894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-catalog-content\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.014939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgg2l\" (UniqueName: \"kubernetes.io/projected/c9506196-576e-414f-9b59-82cdd7123466-kube-api-access-fgg2l\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.015509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-utilities\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.015592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-catalog-content\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.038114 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgg2l\" (UniqueName: \"kubernetes.io/projected/c9506196-576e-414f-9b59-82cdd7123466-kube-api-access-fgg2l\") pod \"redhat-operators-gq4fn\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.148936 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.557802 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gq4fn"] Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.739063 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9506196-576e-414f-9b59-82cdd7123466" containerID="b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5" exitCode=0 Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.739112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerDied","Data":"b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5"} Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.739141 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerStarted","Data":"b87fbc24e69c7314bba9233a384c2d405dab22b23b8089dad4e389e4e8b76c40"} Dec 04 00:55:31 crc kubenswrapper[4764]: I1204 00:55:31.741291 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 00:55:32 crc kubenswrapper[4764]: I1204 00:55:32.752300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerStarted","Data":"e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3"} Dec 04 00:55:33 crc kubenswrapper[4764]: I1204 00:55:33.765095 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9506196-576e-414f-9b59-82cdd7123466" containerID="e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3" exitCode=0 Dec 04 00:55:33 crc kubenswrapper[4764]: I1204 00:55:33.765239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerDied","Data":"e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3"} Dec 04 00:55:34 crc kubenswrapper[4764]: I1204 00:55:34.779211 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerStarted","Data":"3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2"} Dec 04 00:55:34 crc kubenswrapper[4764]: I1204 00:55:34.806254 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gq4fn" podStartSLOduration=2.227785032 podStartE2EDuration="4.806227067s" podCreationTimestamp="2025-12-04 00:55:30 +0000 UTC" firstStartedPulling="2025-12-04 00:55:31.740982093 +0000 UTC m=+4467.502306494" lastFinishedPulling="2025-12-04 00:55:34.319424078 +0000 UTC m=+4470.080748529" observedRunningTime="2025-12-04 00:55:34.805639633 +0000 UTC m=+4470.566964094" watchObservedRunningTime="2025-12-04 00:55:34.806227067 +0000 UTC m=+4470.567551528" Dec 04 00:55:41 crc kubenswrapper[4764]: I1204 00:55:41.149800 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:41 crc kubenswrapper[4764]: I1204 00:55:41.150425 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:42 crc kubenswrapper[4764]: I1204 00:55:42.216768 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gq4fn" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="registry-server" probeResult="failure" output=< Dec 04 00:55:42 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 00:55:42 crc kubenswrapper[4764]: > Dec 04 00:55:45 crc kubenswrapper[4764]: I1204 
00:55:45.941364 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8cwj"] Dec 04 00:55:45 crc kubenswrapper[4764]: I1204 00:55:45.943897 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:45 crc kubenswrapper[4764]: I1204 00:55:45.967133 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8cwj"] Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.075236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9zgn\" (UniqueName: \"kubernetes.io/projected/b179638b-abbe-4ef6-8a13-d0fe27369651-kube-api-access-c9zgn\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.075352 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-catalog-content\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.075554 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-utilities\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.176422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-utilities\") pod 
\"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.176505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9zgn\" (UniqueName: \"kubernetes.io/projected/b179638b-abbe-4ef6-8a13-d0fe27369651-kube-api-access-c9zgn\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.176553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-catalog-content\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.176988 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-catalog-content\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.177208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-utilities\") pod \"redhat-marketplace-l8cwj\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.209959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9zgn\" (UniqueName: \"kubernetes.io/projected/b179638b-abbe-4ef6-8a13-d0fe27369651-kube-api-access-c9zgn\") pod \"redhat-marketplace-l8cwj\" (UID: 
\"b179638b-abbe-4ef6-8a13-d0fe27369651\") " pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.267110 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.703849 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8cwj"] Dec 04 00:55:46 crc kubenswrapper[4764]: W1204 00:55:46.710138 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb179638b_abbe_4ef6_8a13_d0fe27369651.slice/crio-619c866ac844aea3f40b78e849579320c1e2b32f92182f84e0b3d3f91352f5cc WatchSource:0}: Error finding container 619c866ac844aea3f40b78e849579320c1e2b32f92182f84e0b3d3f91352f5cc: Status 404 returned error can't find the container with id 619c866ac844aea3f40b78e849579320c1e2b32f92182f84e0b3d3f91352f5cc Dec 04 00:55:46 crc kubenswrapper[4764]: I1204 00:55:46.889453 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8cwj" event={"ID":"b179638b-abbe-4ef6-8a13-d0fe27369651","Type":"ContainerStarted","Data":"619c866ac844aea3f40b78e849579320c1e2b32f92182f84e0b3d3f91352f5cc"} Dec 04 00:55:47 crc kubenswrapper[4764]: I1204 00:55:47.906062 4764 generic.go:334] "Generic (PLEG): container finished" podID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerID="11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2" exitCode=0 Dec 04 00:55:47 crc kubenswrapper[4764]: I1204 00:55:47.906126 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8cwj" event={"ID":"b179638b-abbe-4ef6-8a13-d0fe27369651","Type":"ContainerDied","Data":"11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2"} Dec 04 00:55:49 crc kubenswrapper[4764]: I1204 00:55:49.929398 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerID="c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c" exitCode=0 Dec 04 00:55:49 crc kubenswrapper[4764]: I1204 00:55:49.929470 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8cwj" event={"ID":"b179638b-abbe-4ef6-8a13-d0fe27369651","Type":"ContainerDied","Data":"c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c"} Dec 04 00:55:50 crc kubenswrapper[4764]: I1204 00:55:50.941474 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8cwj" event={"ID":"b179638b-abbe-4ef6-8a13-d0fe27369651","Type":"ContainerStarted","Data":"24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509"} Dec 04 00:55:50 crc kubenswrapper[4764]: I1204 00:55:50.973297 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8cwj" podStartSLOduration=3.381505536 podStartE2EDuration="5.973257908s" podCreationTimestamp="2025-12-04 00:55:45 +0000 UTC" firstStartedPulling="2025-12-04 00:55:47.90861934 +0000 UTC m=+4483.669943791" lastFinishedPulling="2025-12-04 00:55:50.500371712 +0000 UTC m=+4486.261696163" observedRunningTime="2025-12-04 00:55:50.959282284 +0000 UTC m=+4486.720606735" watchObservedRunningTime="2025-12-04 00:55:50.973257908 +0000 UTC m=+4486.734582349" Dec 04 00:55:51 crc kubenswrapper[4764]: I1204 00:55:51.212682 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:51 crc kubenswrapper[4764]: I1204 00:55:51.281083 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:52 crc kubenswrapper[4764]: I1204 00:55:52.516536 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq4fn"] Dec 04 00:55:52 crc 
kubenswrapper[4764]: I1204 00:55:52.958999 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gq4fn" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="registry-server" containerID="cri-o://3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2" gracePeriod=2 Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.520108 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.601960 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-utilities\") pod \"c9506196-576e-414f-9b59-82cdd7123466\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.602131 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-catalog-content\") pod \"c9506196-576e-414f-9b59-82cdd7123466\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.602202 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgg2l\" (UniqueName: \"kubernetes.io/projected/c9506196-576e-414f-9b59-82cdd7123466-kube-api-access-fgg2l\") pod \"c9506196-576e-414f-9b59-82cdd7123466\" (UID: \"c9506196-576e-414f-9b59-82cdd7123466\") " Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.603463 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-utilities" (OuterVolumeSpecName: "utilities") pod "c9506196-576e-414f-9b59-82cdd7123466" (UID: "c9506196-576e-414f-9b59-82cdd7123466"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.613909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9506196-576e-414f-9b59-82cdd7123466-kube-api-access-fgg2l" (OuterVolumeSpecName: "kube-api-access-fgg2l") pod "c9506196-576e-414f-9b59-82cdd7123466" (UID: "c9506196-576e-414f-9b59-82cdd7123466"). InnerVolumeSpecName "kube-api-access-fgg2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.705402 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.705452 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgg2l\" (UniqueName: \"kubernetes.io/projected/c9506196-576e-414f-9b59-82cdd7123466-kube-api-access-fgg2l\") on node \"crc\" DevicePath \"\"" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.759257 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9506196-576e-414f-9b59-82cdd7123466" (UID: "c9506196-576e-414f-9b59-82cdd7123466"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.807478 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9506196-576e-414f-9b59-82cdd7123466-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.970077 4764 generic.go:334] "Generic (PLEG): container finished" podID="c9506196-576e-414f-9b59-82cdd7123466" containerID="3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2" exitCode=0 Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.970157 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gq4fn" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.970145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerDied","Data":"3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2"} Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.970304 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gq4fn" event={"ID":"c9506196-576e-414f-9b59-82cdd7123466","Type":"ContainerDied","Data":"b87fbc24e69c7314bba9233a384c2d405dab22b23b8089dad4e389e4e8b76c40"} Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.970356 4764 scope.go:117] "RemoveContainer" containerID="3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2" Dec 04 00:55:53 crc kubenswrapper[4764]: I1204 00:55:53.996498 4764 scope.go:117] "RemoveContainer" containerID="e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.012971 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gq4fn"] Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 
00:55:54.020065 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gq4fn"] Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.043273 4764 scope.go:117] "RemoveContainer" containerID="b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.073247 4764 scope.go:117] "RemoveContainer" containerID="3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2" Dec 04 00:55:54 crc kubenswrapper[4764]: E1204 00:55:54.073994 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2\": container with ID starting with 3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2 not found: ID does not exist" containerID="3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.074049 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2"} err="failed to get container status \"3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2\": rpc error: code = NotFound desc = could not find container \"3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2\": container with ID starting with 3ce00a998aedc837bf29c2b2f7ee6a49e7501b8b9311c16fb58d58bccc8f23b2 not found: ID does not exist" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.074080 4764 scope.go:117] "RemoveContainer" containerID="e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3" Dec 04 00:55:54 crc kubenswrapper[4764]: E1204 00:55:54.074528 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3\": container with ID 
starting with e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3 not found: ID does not exist" containerID="e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.074556 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3"} err="failed to get container status \"e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3\": rpc error: code = NotFound desc = could not find container \"e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3\": container with ID starting with e23fb3a825bf8f9cd1a4f912818bcb3872e3b11969d29c941273c15677c653d3 not found: ID does not exist" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.074574 4764 scope.go:117] "RemoveContainer" containerID="b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5" Dec 04 00:55:54 crc kubenswrapper[4764]: E1204 00:55:54.074975 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5\": container with ID starting with b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5 not found: ID does not exist" containerID="b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.075012 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5"} err="failed to get container status \"b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5\": rpc error: code = NotFound desc = could not find container \"b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5\": container with ID starting with b6fafd6f4382adfc4d25571df419e508e525bbce47671410d3385f386403caa5 not found: 
ID does not exist" Dec 04 00:55:54 crc kubenswrapper[4764]: I1204 00:55:54.560790 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9506196-576e-414f-9b59-82cdd7123466" path="/var/lib/kubelet/pods/c9506196-576e-414f-9b59-82cdd7123466/volumes" Dec 04 00:55:56 crc kubenswrapper[4764]: I1204 00:55:56.267588 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:56 crc kubenswrapper[4764]: I1204 00:55:56.267675 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:56 crc kubenswrapper[4764]: I1204 00:55:56.343835 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:57 crc kubenswrapper[4764]: I1204 00:55:57.071857 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:57 crc kubenswrapper[4764]: I1204 00:55:57.919236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8cwj"] Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.017309 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8cwj" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="registry-server" containerID="cri-o://24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509" gracePeriod=2 Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.538350 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.703391 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-utilities\") pod \"b179638b-abbe-4ef6-8a13-d0fe27369651\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.703523 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-catalog-content\") pod \"b179638b-abbe-4ef6-8a13-d0fe27369651\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.703690 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9zgn\" (UniqueName: \"kubernetes.io/projected/b179638b-abbe-4ef6-8a13-d0fe27369651-kube-api-access-c9zgn\") pod \"b179638b-abbe-4ef6-8a13-d0fe27369651\" (UID: \"b179638b-abbe-4ef6-8a13-d0fe27369651\") " Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.705481 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-utilities" (OuterVolumeSpecName: "utilities") pod "b179638b-abbe-4ef6-8a13-d0fe27369651" (UID: "b179638b-abbe-4ef6-8a13-d0fe27369651"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.712921 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b179638b-abbe-4ef6-8a13-d0fe27369651-kube-api-access-c9zgn" (OuterVolumeSpecName: "kube-api-access-c9zgn") pod "b179638b-abbe-4ef6-8a13-d0fe27369651" (UID: "b179638b-abbe-4ef6-8a13-d0fe27369651"). InnerVolumeSpecName "kube-api-access-c9zgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.734246 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b179638b-abbe-4ef6-8a13-d0fe27369651" (UID: "b179638b-abbe-4ef6-8a13-d0fe27369651"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.805987 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9zgn\" (UniqueName: \"kubernetes.io/projected/b179638b-abbe-4ef6-8a13-d0fe27369651-kube-api-access-c9zgn\") on node \"crc\" DevicePath \"\"" Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.806033 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:55:59 crc kubenswrapper[4764]: I1204 00:55:59.806073 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b179638b-abbe-4ef6-8a13-d0fe27369651-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.027975 4764 generic.go:334] "Generic (PLEG): container finished" podID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerID="24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509" exitCode=0 Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.028020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8cwj" event={"ID":"b179638b-abbe-4ef6-8a13-d0fe27369651","Type":"ContainerDied","Data":"24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509"} Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.028049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-l8cwj" event={"ID":"b179638b-abbe-4ef6-8a13-d0fe27369651","Type":"ContainerDied","Data":"619c866ac844aea3f40b78e849579320c1e2b32f92182f84e0b3d3f91352f5cc"} Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.028070 4764 scope.go:117] "RemoveContainer" containerID="24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.028089 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8cwj" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.062445 4764 scope.go:117] "RemoveContainer" containerID="c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.066425 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8cwj"] Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.073764 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8cwj"] Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.103858 4764 scope.go:117] "RemoveContainer" containerID="11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.123405 4764 scope.go:117] "RemoveContainer" containerID="24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509" Dec 04 00:56:00 crc kubenswrapper[4764]: E1204 00:56:00.123955 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509\": container with ID starting with 24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509 not found: ID does not exist" containerID="24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.123997 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509"} err="failed to get container status \"24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509\": rpc error: code = NotFound desc = could not find container \"24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509\": container with ID starting with 24ec9d76f060eeeb642568e51222e6433e9f3969151f82ce3a7cd0e3ab532509 not found: ID does not exist" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.124031 4764 scope.go:117] "RemoveContainer" containerID="c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c" Dec 04 00:56:00 crc kubenswrapper[4764]: E1204 00:56:00.124523 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c\": container with ID starting with c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c not found: ID does not exist" containerID="c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.124569 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c"} err="failed to get container status \"c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c\": rpc error: code = NotFound desc = could not find container \"c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c\": container with ID starting with c791a8d19dd2042a05d8262adc3579b208e5167dd1657969417d3cc257551f6c not found: ID does not exist" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.124603 4764 scope.go:117] "RemoveContainer" containerID="11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2" Dec 04 00:56:00 crc kubenswrapper[4764]: E1204 
00:56:00.125231 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2\": container with ID starting with 11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2 not found: ID does not exist" containerID="11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.125278 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2"} err="failed to get container status \"11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2\": rpc error: code = NotFound desc = could not find container \"11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2\": container with ID starting with 11284b392311aa23133ee9e84540eee15a58a99870c3b573e06b93f84558e8c2 not found: ID does not exist" Dec 04 00:56:00 crc kubenswrapper[4764]: I1204 00:56:00.563308 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" path="/var/lib/kubelet/pods/b179638b-abbe-4ef6-8a13-d0fe27369651/volumes" Dec 04 00:56:20 crc kubenswrapper[4764]: I1204 00:56:20.869747 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:56:20 crc kubenswrapper[4764]: I1204 00:56:20.870418 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.798776 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qhsbd"] Dec 04 00:56:30 crc kubenswrapper[4764]: E1204 00:56:30.799976 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="extract-content" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800001 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="extract-content" Dec 04 00:56:30 crc kubenswrapper[4764]: E1204 00:56:30.800035 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="extract-utilities" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800047 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="extract-utilities" Dec 04 00:56:30 crc kubenswrapper[4764]: E1204 00:56:30.800078 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="registry-server" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800108 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="registry-server" Dec 04 00:56:30 crc kubenswrapper[4764]: E1204 00:56:30.800123 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="registry-server" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800134 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="registry-server" Dec 04 00:56:30 crc kubenswrapper[4764]: E1204 00:56:30.800156 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="extract-content" Dec 04 00:56:30 crc 
kubenswrapper[4764]: I1204 00:56:30.800164 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="extract-content" Dec 04 00:56:30 crc kubenswrapper[4764]: E1204 00:56:30.800178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="extract-utilities" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800185 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="extract-utilities" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800351 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b179638b-abbe-4ef6-8a13-d0fe27369651" containerName="registry-server" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.800375 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9506196-576e-414f-9b59-82cdd7123466" containerName="registry-server" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.801970 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.831285 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhsbd"] Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.875675 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-catalog-content\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.875779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-utilities\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.875946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj9h6\" (UniqueName: \"kubernetes.io/projected/7b49f27f-4051-4a17-9168-1be5cb4c7601-kube-api-access-hj9h6\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.976894 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-utilities\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.977349 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hj9h6\" (UniqueName: \"kubernetes.io/projected/7b49f27f-4051-4a17-9168-1be5cb4c7601-kube-api-access-hj9h6\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.977400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-catalog-content\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.977532 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-utilities\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:30 crc kubenswrapper[4764]: I1204 00:56:30.977933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-catalog-content\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:31 crc kubenswrapper[4764]: I1204 00:56:31.004670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj9h6\" (UniqueName: \"kubernetes.io/projected/7b49f27f-4051-4a17-9168-1be5cb4c7601-kube-api-access-hj9h6\") pod \"community-operators-qhsbd\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:31 crc kubenswrapper[4764]: I1204 00:56:31.132418 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:31 crc kubenswrapper[4764]: I1204 00:56:31.657872 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qhsbd"] Dec 04 00:56:32 crc kubenswrapper[4764]: I1204 00:56:32.328265 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerID="3fd020c95d21d8845192fbc652ff649f18d90a5e3caabc57a5a8d7ff32fcffb2" exitCode=0 Dec 04 00:56:32 crc kubenswrapper[4764]: I1204 00:56:32.328319 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerDied","Data":"3fd020c95d21d8845192fbc652ff649f18d90a5e3caabc57a5a8d7ff32fcffb2"} Dec 04 00:56:32 crc kubenswrapper[4764]: I1204 00:56:32.328625 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerStarted","Data":"674a28d46e07e8ed6aafb18ce9275c682027fc525a6a366e033ff85c0b7389a2"} Dec 04 00:56:33 crc kubenswrapper[4764]: I1204 00:56:33.336908 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerStarted","Data":"75b6cb451dd3b93cd076fc00c705aeb2be7b18886cabd734dd76d66e268c75ea"} Dec 04 00:56:34 crc kubenswrapper[4764]: I1204 00:56:34.351408 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerID="75b6cb451dd3b93cd076fc00c705aeb2be7b18886cabd734dd76d66e268c75ea" exitCode=0 Dec 04 00:56:34 crc kubenswrapper[4764]: I1204 00:56:34.351522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" 
event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerDied","Data":"75b6cb451dd3b93cd076fc00c705aeb2be7b18886cabd734dd76d66e268c75ea"} Dec 04 00:56:35 crc kubenswrapper[4764]: I1204 00:56:35.362268 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerStarted","Data":"9a21c0b0578332ba0abdab678a0405c0de488e3cf118fe128c806b6325813b34"} Dec 04 00:56:35 crc kubenswrapper[4764]: I1204 00:56:35.395170 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qhsbd" podStartSLOduration=2.926876597 podStartE2EDuration="5.395150827s" podCreationTimestamp="2025-12-04 00:56:30 +0000 UTC" firstStartedPulling="2025-12-04 00:56:32.330703382 +0000 UTC m=+4528.092027833" lastFinishedPulling="2025-12-04 00:56:34.798977612 +0000 UTC m=+4530.560302063" observedRunningTime="2025-12-04 00:56:35.392490071 +0000 UTC m=+4531.153814512" watchObservedRunningTime="2025-12-04 00:56:35.395150827 +0000 UTC m=+4531.156475238" Dec 04 00:56:41 crc kubenswrapper[4764]: I1204 00:56:41.132864 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:41 crc kubenswrapper[4764]: I1204 00:56:41.133590 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:41 crc kubenswrapper[4764]: I1204 00:56:41.199106 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:41 crc kubenswrapper[4764]: I1204 00:56:41.482967 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:41 crc kubenswrapper[4764]: I1204 00:56:41.551958 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-qhsbd"] Dec 04 00:56:43 crc kubenswrapper[4764]: I1204 00:56:43.435825 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qhsbd" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="registry-server" containerID="cri-o://9a21c0b0578332ba0abdab678a0405c0de488e3cf118fe128c806b6325813b34" gracePeriod=2 Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.446110 4764 generic.go:334] "Generic (PLEG): container finished" podID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerID="9a21c0b0578332ba0abdab678a0405c0de488e3cf118fe128c806b6325813b34" exitCode=0 Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.446150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerDied","Data":"9a21c0b0578332ba0abdab678a0405c0de488e3cf118fe128c806b6325813b34"} Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.524059 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.588556 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj9h6\" (UniqueName: \"kubernetes.io/projected/7b49f27f-4051-4a17-9168-1be5cb4c7601-kube-api-access-hj9h6\") pod \"7b49f27f-4051-4a17-9168-1be5cb4c7601\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.588619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-catalog-content\") pod \"7b49f27f-4051-4a17-9168-1be5cb4c7601\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.588636 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-utilities\") pod \"7b49f27f-4051-4a17-9168-1be5cb4c7601\" (UID: \"7b49f27f-4051-4a17-9168-1be5cb4c7601\") " Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.589627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-utilities" (OuterVolumeSpecName: "utilities") pod "7b49f27f-4051-4a17-9168-1be5cb4c7601" (UID: "7b49f27f-4051-4a17-9168-1be5cb4c7601"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.600907 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b49f27f-4051-4a17-9168-1be5cb4c7601-kube-api-access-hj9h6" (OuterVolumeSpecName: "kube-api-access-hj9h6") pod "7b49f27f-4051-4a17-9168-1be5cb4c7601" (UID: "7b49f27f-4051-4a17-9168-1be5cb4c7601"). InnerVolumeSpecName "kube-api-access-hj9h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.637131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7b49f27f-4051-4a17-9168-1be5cb4c7601" (UID: "7b49f27f-4051-4a17-9168-1be5cb4c7601"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.689755 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj9h6\" (UniqueName: \"kubernetes.io/projected/7b49f27f-4051-4a17-9168-1be5cb4c7601-kube-api-access-hj9h6\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.689788 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:44 crc kubenswrapper[4764]: I1204 00:56:44.689799 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7b49f27f-4051-4a17-9168-1be5cb4c7601-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.455040 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qhsbd" event={"ID":"7b49f27f-4051-4a17-9168-1be5cb4c7601","Type":"ContainerDied","Data":"674a28d46e07e8ed6aafb18ce9275c682027fc525a6a366e033ff85c0b7389a2"} Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.455104 4764 scope.go:117] "RemoveContainer" containerID="9a21c0b0578332ba0abdab678a0405c0de488e3cf118fe128c806b6325813b34" Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.455136 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qhsbd" Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.475405 4764 scope.go:117] "RemoveContainer" containerID="75b6cb451dd3b93cd076fc00c705aeb2be7b18886cabd734dd76d66e268c75ea" Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.492785 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qhsbd"] Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.500001 4764 scope.go:117] "RemoveContainer" containerID="3fd020c95d21d8845192fbc652ff649f18d90a5e3caabc57a5a8d7ff32fcffb2" Dec 04 00:56:45 crc kubenswrapper[4764]: I1204 00:56:45.500824 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qhsbd"] Dec 04 00:56:46 crc kubenswrapper[4764]: I1204 00:56:46.556291 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" path="/var/lib/kubelet/pods/7b49f27f-4051-4a17-9168-1be5cb4c7601/volumes" Dec 04 00:56:50 crc kubenswrapper[4764]: I1204 00:56:50.869652 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:56:50 crc kubenswrapper[4764]: I1204 00:56:50.870176 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.278236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-x6gzw"] Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 
00:56:53.285826 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-x6gzw"] Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.426581 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-l2p2x"] Dec 04 00:56:53 crc kubenswrapper[4764]: E1204 00:56:53.426956 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="extract-utilities" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.426978 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="extract-utilities" Dec 04 00:56:53 crc kubenswrapper[4764]: E1204 00:56:53.427005 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="registry-server" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.427014 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="registry-server" Dec 04 00:56:53 crc kubenswrapper[4764]: E1204 00:56:53.427027 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="extract-content" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.427035 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="extract-content" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.427209 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b49f27f-4051-4a17-9168-1be5cb4c7601" containerName="registry-server" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.427777 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.430055 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.430309 4764 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jrrvd" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.430489 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.433279 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.438582 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-l2p2x"] Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.449081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d39cc839-2701-412f-9f0c-c3e489fa3bbe-crc-storage\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.449129 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnzww\" (UniqueName: \"kubernetes.io/projected/d39cc839-2701-412f-9f0c-c3e489fa3bbe-kube-api-access-wnzww\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.449202 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d39cc839-2701-412f-9f0c-c3e489fa3bbe-node-mnt\") pod \"crc-storage-crc-l2p2x\" (UID: 
\"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.550881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d39cc839-2701-412f-9f0c-c3e489fa3bbe-crc-storage\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.550923 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnzww\" (UniqueName: \"kubernetes.io/projected/d39cc839-2701-412f-9f0c-c3e489fa3bbe-kube-api-access-wnzww\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.550975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d39cc839-2701-412f-9f0c-c3e489fa3bbe-node-mnt\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.551255 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d39cc839-2701-412f-9f0c-c3e489fa3bbe-node-mnt\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.551608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d39cc839-2701-412f-9f0c-c3e489fa3bbe-crc-storage\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.567975 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnzww\" (UniqueName: \"kubernetes.io/projected/d39cc839-2701-412f-9f0c-c3e489fa3bbe-kube-api-access-wnzww\") pod \"crc-storage-crc-l2p2x\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:53 crc kubenswrapper[4764]: I1204 00:56:53.743319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:54 crc kubenswrapper[4764]: I1204 00:56:54.267939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-l2p2x"] Dec 04 00:56:54 crc kubenswrapper[4764]: I1204 00:56:54.557097 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc55aa0-c11a-4b89-a6bc-38d4967c5204" path="/var/lib/kubelet/pods/9bc55aa0-c11a-4b89-a6bc-38d4967c5204/volumes" Dec 04 00:56:54 crc kubenswrapper[4764]: I1204 00:56:54.557757 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l2p2x" event={"ID":"d39cc839-2701-412f-9f0c-c3e489fa3bbe","Type":"ContainerStarted","Data":"78975feb4eae46fe15d1a3b7ff27cdd16616a918d8ea652ea43146a97e9f3e3b"} Dec 04 00:56:55 crc kubenswrapper[4764]: I1204 00:56:55.560059 4764 generic.go:334] "Generic (PLEG): container finished" podID="d39cc839-2701-412f-9f0c-c3e489fa3bbe" containerID="66ae9d1d2cdeee9cbc403c41827d641c7dd579109c0c30a40637308370bfe30b" exitCode=0 Dec 04 00:56:55 crc kubenswrapper[4764]: I1204 00:56:55.560110 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l2p2x" event={"ID":"d39cc839-2701-412f-9f0c-c3e489fa3bbe","Type":"ContainerDied","Data":"66ae9d1d2cdeee9cbc403c41827d641c7dd579109c0c30a40637308370bfe30b"} Dec 04 00:56:56 crc kubenswrapper[4764]: I1204 00:56:56.902139 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.016790 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnzww\" (UniqueName: \"kubernetes.io/projected/d39cc839-2701-412f-9f0c-c3e489fa3bbe-kube-api-access-wnzww\") pod \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.016957 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d39cc839-2701-412f-9f0c-c3e489fa3bbe-crc-storage\") pod \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.017392 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d39cc839-2701-412f-9f0c-c3e489fa3bbe-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d39cc839-2701-412f-9f0c-c3e489fa3bbe" (UID: "d39cc839-2701-412f-9f0c-c3e489fa3bbe"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.017776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d39cc839-2701-412f-9f0c-c3e489fa3bbe-node-mnt\") pod \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\" (UID: \"d39cc839-2701-412f-9f0c-c3e489fa3bbe\") " Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.018271 4764 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d39cc839-2701-412f-9f0c-c3e489fa3bbe-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.023114 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39cc839-2701-412f-9f0c-c3e489fa3bbe-kube-api-access-wnzww" (OuterVolumeSpecName: "kube-api-access-wnzww") pod "d39cc839-2701-412f-9f0c-c3e489fa3bbe" (UID: "d39cc839-2701-412f-9f0c-c3e489fa3bbe"). InnerVolumeSpecName "kube-api-access-wnzww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.041690 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d39cc839-2701-412f-9f0c-c3e489fa3bbe-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d39cc839-2701-412f-9f0c-c3e489fa3bbe" (UID: "d39cc839-2701-412f-9f0c-c3e489fa3bbe"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.119654 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnzww\" (UniqueName: \"kubernetes.io/projected/d39cc839-2701-412f-9f0c-c3e489fa3bbe-kube-api-access-wnzww\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.119695 4764 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d39cc839-2701-412f-9f0c-c3e489fa3bbe-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.577084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-l2p2x" event={"ID":"d39cc839-2701-412f-9f0c-c3e489fa3bbe","Type":"ContainerDied","Data":"78975feb4eae46fe15d1a3b7ff27cdd16616a918d8ea652ea43146a97e9f3e3b"} Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.577342 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78975feb4eae46fe15d1a3b7ff27cdd16616a918d8ea652ea43146a97e9f3e3b" Dec 04 00:56:57 crc kubenswrapper[4764]: I1204 00:56:57.577139 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-l2p2x" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.224809 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-l2p2x"] Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.231790 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-l2p2x"] Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.375889 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-d7bfn"] Dec 04 00:56:59 crc kubenswrapper[4764]: E1204 00:56:59.376406 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39cc839-2701-412f-9f0c-c3e489fa3bbe" containerName="storage" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.376429 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39cc839-2701-412f-9f0c-c3e489fa3bbe" containerName="storage" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.376704 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39cc839-2701-412f-9f0c-c3e489fa3bbe" containerName="storage" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.378630 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.381907 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.382427 4764 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-jrrvd" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.382473 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.382942 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.388712 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-d7bfn"] Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.458114 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6vvk\" (UniqueName: \"kubernetes.io/projected/fe2b8aed-5b96-45b4-9d89-654db77a00cd-kube-api-access-f6vvk\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.458198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe2b8aed-5b96-45b4-9d89-654db77a00cd-crc-storage\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.458311 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe2b8aed-5b96-45b4-9d89-654db77a00cd-node-mnt\") pod \"crc-storage-crc-d7bfn\" (UID: 
\"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.560429 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6vvk\" (UniqueName: \"kubernetes.io/projected/fe2b8aed-5b96-45b4-9d89-654db77a00cd-kube-api-access-f6vvk\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.560522 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe2b8aed-5b96-45b4-9d89-654db77a00cd-crc-storage\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.560612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe2b8aed-5b96-45b4-9d89-654db77a00cd-node-mnt\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.561012 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe2b8aed-5b96-45b4-9d89-654db77a00cd-node-mnt\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.561545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe2b8aed-5b96-45b4-9d89-654db77a00cd-crc-storage\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.590184 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6vvk\" (UniqueName: \"kubernetes.io/projected/fe2b8aed-5b96-45b4-9d89-654db77a00cd-kube-api-access-f6vvk\") pod \"crc-storage-crc-d7bfn\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:56:59 crc kubenswrapper[4764]: I1204 00:56:59.705371 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:57:00 crc kubenswrapper[4764]: I1204 00:57:00.243469 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-d7bfn"] Dec 04 00:57:00 crc kubenswrapper[4764]: W1204 00:57:00.254070 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe2b8aed_5b96_45b4_9d89_654db77a00cd.slice/crio-8d9372c7af9815c3ad1de684415a26771422a9223da47486d1904bb2108dcf15 WatchSource:0}: Error finding container 8d9372c7af9815c3ad1de684415a26771422a9223da47486d1904bb2108dcf15: Status 404 returned error can't find the container with id 8d9372c7af9815c3ad1de684415a26771422a9223da47486d1904bb2108dcf15 Dec 04 00:57:00 crc kubenswrapper[4764]: I1204 00:57:00.558050 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d39cc839-2701-412f-9f0c-c3e489fa3bbe" path="/var/lib/kubelet/pods/d39cc839-2701-412f-9f0c-c3e489fa3bbe/volumes" Dec 04 00:57:00 crc kubenswrapper[4764]: I1204 00:57:00.605109 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d7bfn" event={"ID":"fe2b8aed-5b96-45b4-9d89-654db77a00cd","Type":"ContainerStarted","Data":"8d9372c7af9815c3ad1de684415a26771422a9223da47486d1904bb2108dcf15"} Dec 04 00:57:01 crc kubenswrapper[4764]: I1204 00:57:01.613806 4764 generic.go:334] "Generic (PLEG): container finished" podID="fe2b8aed-5b96-45b4-9d89-654db77a00cd" containerID="3d40f0c6e2b77f4a55de73ed882a467bd78b53703cb464e8b7de34d889a1e416" 
exitCode=0 Dec 04 00:57:01 crc kubenswrapper[4764]: I1204 00:57:01.614133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d7bfn" event={"ID":"fe2b8aed-5b96-45b4-9d89-654db77a00cd","Type":"ContainerDied","Data":"3d40f0c6e2b77f4a55de73ed882a467bd78b53703cb464e8b7de34d889a1e416"} Dec 04 00:57:02 crc kubenswrapper[4764]: I1204 00:57:02.954588 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.033131 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6vvk\" (UniqueName: \"kubernetes.io/projected/fe2b8aed-5b96-45b4-9d89-654db77a00cd-kube-api-access-f6vvk\") pod \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.033191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe2b8aed-5b96-45b4-9d89-654db77a00cd-crc-storage\") pod \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.033331 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe2b8aed-5b96-45b4-9d89-654db77a00cd-node-mnt\") pod \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\" (UID: \"fe2b8aed-5b96-45b4-9d89-654db77a00cd\") " Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.033680 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe2b8aed-5b96-45b4-9d89-654db77a00cd-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fe2b8aed-5b96-45b4-9d89-654db77a00cd" (UID: "fe2b8aed-5b96-45b4-9d89-654db77a00cd"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.044642 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2b8aed-5b96-45b4-9d89-654db77a00cd-kube-api-access-f6vvk" (OuterVolumeSpecName: "kube-api-access-f6vvk") pod "fe2b8aed-5b96-45b4-9d89-654db77a00cd" (UID: "fe2b8aed-5b96-45b4-9d89-654db77a00cd"). InnerVolumeSpecName "kube-api-access-f6vvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.063987 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe2b8aed-5b96-45b4-9d89-654db77a00cd-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fe2b8aed-5b96-45b4-9d89-654db77a00cd" (UID: "fe2b8aed-5b96-45b4-9d89-654db77a00cd"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.134802 4764 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe2b8aed-5b96-45b4-9d89-654db77a00cd-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.134836 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6vvk\" (UniqueName: \"kubernetes.io/projected/fe2b8aed-5b96-45b4-9d89-654db77a00cd-kube-api-access-f6vvk\") on node \"crc\" DevicePath \"\"" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.134849 4764 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe2b8aed-5b96-45b4-9d89-654db77a00cd-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.634289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-d7bfn" 
event={"ID":"fe2b8aed-5b96-45b4-9d89-654db77a00cd","Type":"ContainerDied","Data":"8d9372c7af9815c3ad1de684415a26771422a9223da47486d1904bb2108dcf15"} Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.634347 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-d7bfn" Dec 04 00:57:03 crc kubenswrapper[4764]: I1204 00:57:03.634359 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d9372c7af9815c3ad1de684415a26771422a9223da47486d1904bb2108dcf15" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.410119 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-clbdx"] Dec 04 00:57:08 crc kubenswrapper[4764]: E1204 00:57:08.411005 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2b8aed-5b96-45b4-9d89-654db77a00cd" containerName="storage" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.411019 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2b8aed-5b96-45b4-9d89-654db77a00cd" containerName="storage" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.411146 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2b8aed-5b96-45b4-9d89-654db77a00cd" containerName="storage" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.412186 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.427900 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-clbdx"] Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.513656 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz55k\" (UniqueName: \"kubernetes.io/projected/70596833-bc78-4572-8a74-d3c5ebbc94f2-kube-api-access-hz55k\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.514016 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-utilities\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.514107 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-catalog-content\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.616395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz55k\" (UniqueName: \"kubernetes.io/projected/70596833-bc78-4572-8a74-d3c5ebbc94f2-kube-api-access-hz55k\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.616523 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-utilities\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.616619 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-catalog-content\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.617662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-catalog-content\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.617662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-utilities\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.642195 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz55k\" (UniqueName: \"kubernetes.io/projected/70596833-bc78-4572-8a74-d3c5ebbc94f2-kube-api-access-hz55k\") pod \"certified-operators-clbdx\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:08 crc kubenswrapper[4764]: I1204 00:57:08.777161 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:09 crc kubenswrapper[4764]: I1204 00:57:09.311578 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-clbdx"] Dec 04 00:57:09 crc kubenswrapper[4764]: I1204 00:57:09.683368 4764 generic.go:334] "Generic (PLEG): container finished" podID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerID="36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b" exitCode=0 Dec 04 00:57:09 crc kubenswrapper[4764]: I1204 00:57:09.683457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerDied","Data":"36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b"} Dec 04 00:57:09 crc kubenswrapper[4764]: I1204 00:57:09.683704 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerStarted","Data":"c76a4f73184ddcc0c998ff4f75e6038cfc1aceb8b1f050a7b70da99b421437d6"} Dec 04 00:57:09 crc kubenswrapper[4764]: I1204 00:57:09.969604 4764 scope.go:117] "RemoveContainer" containerID="ebfb6447773808600dfd68ef4a8ebc888911b18335cb3bd557ac538309ce7f02" Dec 04 00:57:10 crc kubenswrapper[4764]: I1204 00:57:10.696019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerStarted","Data":"8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85"} Dec 04 00:57:11 crc kubenswrapper[4764]: I1204 00:57:11.707231 4764 generic.go:334] "Generic (PLEG): container finished" podID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerID="8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85" exitCode=0 Dec 04 00:57:11 crc kubenswrapper[4764]: I1204 00:57:11.707295 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerDied","Data":"8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85"} Dec 04 00:57:12 crc kubenswrapper[4764]: I1204 00:57:12.724369 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerStarted","Data":"d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d"} Dec 04 00:57:12 crc kubenswrapper[4764]: I1204 00:57:12.746621 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-clbdx" podStartSLOduration=2.327623025 podStartE2EDuration="4.746603759s" podCreationTimestamp="2025-12-04 00:57:08 +0000 UTC" firstStartedPulling="2025-12-04 00:57:09.696173221 +0000 UTC m=+4565.457497662" lastFinishedPulling="2025-12-04 00:57:12.115153945 +0000 UTC m=+4567.876478396" observedRunningTime="2025-12-04 00:57:12.745260656 +0000 UTC m=+4568.506585107" watchObservedRunningTime="2025-12-04 00:57:12.746603759 +0000 UTC m=+4568.507928180" Dec 04 00:57:18 crc kubenswrapper[4764]: I1204 00:57:18.777681 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:18 crc kubenswrapper[4764]: I1204 00:57:18.778557 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:18 crc kubenswrapper[4764]: I1204 00:57:18.818581 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:19 crc kubenswrapper[4764]: I1204 00:57:19.909678 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:19 crc kubenswrapper[4764]: 
I1204 00:57:19.964157 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-clbdx"] Dec 04 00:57:20 crc kubenswrapper[4764]: I1204 00:57:20.868852 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 00:57:20 crc kubenswrapper[4764]: I1204 00:57:20.868942 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 00:57:20 crc kubenswrapper[4764]: I1204 00:57:20.869045 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 00:57:20 crc kubenswrapper[4764]: I1204 00:57:20.870089 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 00:57:20 crc kubenswrapper[4764]: I1204 00:57:20.870208 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" gracePeriod=600 Dec 04 00:57:21 crc kubenswrapper[4764]: I1204 00:57:21.822229 4764 generic.go:334] 
"Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" exitCode=0 Dec 04 00:57:21 crc kubenswrapper[4764]: I1204 00:57:21.822303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2"} Dec 04 00:57:21 crc kubenswrapper[4764]: I1204 00:57:21.822371 4764 scope.go:117] "RemoveContainer" containerID="3142c9df3f9a8748567ccbbdeae293cfb794b8a558723085d6b857f15dc9fb59" Dec 04 00:57:21 crc kubenswrapper[4764]: I1204 00:57:21.822457 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-clbdx" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="registry-server" containerID="cri-o://d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d" gracePeriod=2 Dec 04 00:57:22 crc kubenswrapper[4764]: E1204 00:57:22.148064 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.348901 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.446539 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-catalog-content\") pod \"70596833-bc78-4572-8a74-d3c5ebbc94f2\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.446609 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-utilities\") pod \"70596833-bc78-4572-8a74-d3c5ebbc94f2\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.446753 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz55k\" (UniqueName: \"kubernetes.io/projected/70596833-bc78-4572-8a74-d3c5ebbc94f2-kube-api-access-hz55k\") pod \"70596833-bc78-4572-8a74-d3c5ebbc94f2\" (UID: \"70596833-bc78-4572-8a74-d3c5ebbc94f2\") " Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.447604 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-utilities" (OuterVolumeSpecName: "utilities") pod "70596833-bc78-4572-8a74-d3c5ebbc94f2" (UID: "70596833-bc78-4572-8a74-d3c5ebbc94f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.455482 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70596833-bc78-4572-8a74-d3c5ebbc94f2-kube-api-access-hz55k" (OuterVolumeSpecName: "kube-api-access-hz55k") pod "70596833-bc78-4572-8a74-d3c5ebbc94f2" (UID: "70596833-bc78-4572-8a74-d3c5ebbc94f2"). InnerVolumeSpecName "kube-api-access-hz55k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.528944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "70596833-bc78-4572-8a74-d3c5ebbc94f2" (UID: "70596833-bc78-4572-8a74-d3c5ebbc94f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.548800 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.549137 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz55k\" (UniqueName: \"kubernetes.io/projected/70596833-bc78-4572-8a74-d3c5ebbc94f2-kube-api-access-hz55k\") on node \"crc\" DevicePath \"\"" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.549152 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/70596833-bc78-4572-8a74-d3c5ebbc94f2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.835700 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:57:22 crc kubenswrapper[4764]: E1204 00:57:22.836032 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:57:22 
crc kubenswrapper[4764]: I1204 00:57:22.839091 4764 generic.go:334] "Generic (PLEG): container finished" podID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerID="d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d" exitCode=0 Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.839160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerDied","Data":"d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d"} Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.839229 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clbdx" event={"ID":"70596833-bc78-4572-8a74-d3c5ebbc94f2","Type":"ContainerDied","Data":"c76a4f73184ddcc0c998ff4f75e6038cfc1aceb8b1f050a7b70da99b421437d6"} Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.839272 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clbdx" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.839278 4764 scope.go:117] "RemoveContainer" containerID="d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.888143 4764 scope.go:117] "RemoveContainer" containerID="8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.905572 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-clbdx"] Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.917207 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-clbdx"] Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.921930 4764 scope.go:117] "RemoveContainer" containerID="36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.944654 4764 scope.go:117] "RemoveContainer" containerID="d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d" Dec 04 00:57:22 crc kubenswrapper[4764]: E1204 00:57:22.945093 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d\": container with ID starting with d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d not found: ID does not exist" containerID="d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.945138 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d"} err="failed to get container status \"d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d\": rpc error: code = NotFound desc = could not find 
container \"d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d\": container with ID starting with d2f0211220ddc60397b6579d3bf08e1e002023e98899346cb38646650308121d not found: ID does not exist" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.945167 4764 scope.go:117] "RemoveContainer" containerID="8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85" Dec 04 00:57:22 crc kubenswrapper[4764]: E1204 00:57:22.945517 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85\": container with ID starting with 8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85 not found: ID does not exist" containerID="8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.945556 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85"} err="failed to get container status \"8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85\": rpc error: code = NotFound desc = could not find container \"8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85\": container with ID starting with 8dd2cc0976002b0dedb8166f367b1f85f9c05ea07fff1d59513ce1d4b6b4de85 not found: ID does not exist" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.945585 4764 scope.go:117] "RemoveContainer" containerID="36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b" Dec 04 00:57:22 crc kubenswrapper[4764]: E1204 00:57:22.945852 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b\": container with ID starting with 36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b not found: ID does 
not exist" containerID="36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b" Dec 04 00:57:22 crc kubenswrapper[4764]: I1204 00:57:22.945872 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b"} err="failed to get container status \"36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b\": rpc error: code = NotFound desc = could not find container \"36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b\": container with ID starting with 36ec7ba4913b7b2ce730e6c6a5715825dbf964214064cc8e113eb9cf09394d5b not found: ID does not exist" Dec 04 00:57:24 crc kubenswrapper[4764]: I1204 00:57:24.565680 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" path="/var/lib/kubelet/pods/70596833-bc78-4572-8a74-d3c5ebbc94f2/volumes" Dec 04 00:57:36 crc kubenswrapper[4764]: I1204 00:57:36.546634 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:57:36 crc kubenswrapper[4764]: E1204 00:57:36.547580 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:57:51 crc kubenswrapper[4764]: I1204 00:57:51.545534 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:57:51 crc kubenswrapper[4764]: E1204 00:57:51.546486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:58:04 crc kubenswrapper[4764]: I1204 00:58:04.553484 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:58:04 crc kubenswrapper[4764]: E1204 00:58:04.556455 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:58:16 crc kubenswrapper[4764]: I1204 00:58:16.546297 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:58:16 crc kubenswrapper[4764]: E1204 00:58:16.547153 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:58:31 crc kubenswrapper[4764]: I1204 00:58:31.546102 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:58:31 crc kubenswrapper[4764]: E1204 00:58:31.547187 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:58:42 crc kubenswrapper[4764]: I1204 00:58:42.546056 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:58:42 crc kubenswrapper[4764]: E1204 00:58:42.546879 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:58:55 crc kubenswrapper[4764]: I1204 00:58:55.546343 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:58:55 crc kubenswrapper[4764]: E1204 00:58:55.547046 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:59:06 crc kubenswrapper[4764]: I1204 00:59:06.546184 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:59:06 crc kubenswrapper[4764]: E1204 00:59:06.547276 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:59:18 crc kubenswrapper[4764]: I1204 00:59:18.546192 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:59:18 crc kubenswrapper[4764]: E1204 00:59:18.547101 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:59:32 crc kubenswrapper[4764]: I1204 00:59:32.545590 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:59:32 crc kubenswrapper[4764]: E1204 00:59:32.546538 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:59:44 crc kubenswrapper[4764]: I1204 00:59:44.553114 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:59:44 crc kubenswrapper[4764]: E1204 00:59:44.555111 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 00:59:58 crc kubenswrapper[4764]: I1204 00:59:58.546289 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 00:59:58 crc kubenswrapper[4764]: E1204 00:59:58.547448 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.168518 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk"] Dec 04 01:00:00 crc kubenswrapper[4764]: E1204 01:00:00.168888 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="extract-utilities" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.168905 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="extract-utilities" Dec 04 01:00:00 crc kubenswrapper[4764]: E1204 01:00:00.168939 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="registry-server" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.168949 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" 
containerName="registry-server" Dec 04 01:00:00 crc kubenswrapper[4764]: E1204 01:00:00.168967 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="extract-content" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.168978 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="extract-content" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.169611 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="70596833-bc78-4572-8a74-d3c5ebbc94f2" containerName="registry-server" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.170249 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.178245 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.178403 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.191188 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk"] Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.237382 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784b53ea-54fc-406b-8bc9-51e437063d4c-config-volume\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.237502 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784b53ea-54fc-406b-8bc9-51e437063d4c-secret-volume\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.237560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbd62\" (UniqueName: \"kubernetes.io/projected/784b53ea-54fc-406b-8bc9-51e437063d4c-kube-api-access-rbd62\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.338868 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784b53ea-54fc-406b-8bc9-51e437063d4c-config-volume\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.338979 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784b53ea-54fc-406b-8bc9-51e437063d4c-secret-volume\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.339044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbd62\" (UniqueName: \"kubernetes.io/projected/784b53ea-54fc-406b-8bc9-51e437063d4c-kube-api-access-rbd62\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.340606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784b53ea-54fc-406b-8bc9-51e437063d4c-config-volume\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.349975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784b53ea-54fc-406b-8bc9-51e437063d4c-secret-volume\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.373890 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbd62\" (UniqueName: \"kubernetes.io/projected/784b53ea-54fc-406b-8bc9-51e437063d4c-kube-api-access-rbd62\") pod \"collect-profiles-29413500-w95sk\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.499289 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:00 crc kubenswrapper[4764]: W1204 01:00:00.806838 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod784b53ea_54fc_406b_8bc9_51e437063d4c.slice/crio-7ad74dff58825571d386482c3174c619efcb396b1197f8665d452cb657a47552 WatchSource:0}: Error finding container 7ad74dff58825571d386482c3174c619efcb396b1197f8665d452cb657a47552: Status 404 returned error can't find the container with id 7ad74dff58825571d386482c3174c619efcb396b1197f8665d452cb657a47552 Dec 04 01:00:00 crc kubenswrapper[4764]: I1204 01:00:00.811665 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk"] Dec 04 01:00:01 crc kubenswrapper[4764]: I1204 01:00:01.400205 4764 generic.go:334] "Generic (PLEG): container finished" podID="784b53ea-54fc-406b-8bc9-51e437063d4c" containerID="b14fa99ec99f7a038ec7eae9833282d82da41584777b00c049a8e931eb4eb614" exitCode=0 Dec 04 01:00:01 crc kubenswrapper[4764]: I1204 01:00:01.400287 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" event={"ID":"784b53ea-54fc-406b-8bc9-51e437063d4c","Type":"ContainerDied","Data":"b14fa99ec99f7a038ec7eae9833282d82da41584777b00c049a8e931eb4eb614"} Dec 04 01:00:01 crc kubenswrapper[4764]: I1204 01:00:01.400541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" event={"ID":"784b53ea-54fc-406b-8bc9-51e437063d4c","Type":"ContainerStarted","Data":"7ad74dff58825571d386482c3174c619efcb396b1197f8665d452cb657a47552"} Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.237231 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.283163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784b53ea-54fc-406b-8bc9-51e437063d4c-secret-volume\") pod \"784b53ea-54fc-406b-8bc9-51e437063d4c\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.283265 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbd62\" (UniqueName: \"kubernetes.io/projected/784b53ea-54fc-406b-8bc9-51e437063d4c-kube-api-access-rbd62\") pod \"784b53ea-54fc-406b-8bc9-51e437063d4c\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.283357 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784b53ea-54fc-406b-8bc9-51e437063d4c-config-volume\") pod \"784b53ea-54fc-406b-8bc9-51e437063d4c\" (UID: \"784b53ea-54fc-406b-8bc9-51e437063d4c\") " Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.284880 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784b53ea-54fc-406b-8bc9-51e437063d4c-config-volume" (OuterVolumeSpecName: "config-volume") pod "784b53ea-54fc-406b-8bc9-51e437063d4c" (UID: "784b53ea-54fc-406b-8bc9-51e437063d4c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.289922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/784b53ea-54fc-406b-8bc9-51e437063d4c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "784b53ea-54fc-406b-8bc9-51e437063d4c" (UID: "784b53ea-54fc-406b-8bc9-51e437063d4c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.290096 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784b53ea-54fc-406b-8bc9-51e437063d4c-kube-api-access-rbd62" (OuterVolumeSpecName: "kube-api-access-rbd62") pod "784b53ea-54fc-406b-8bc9-51e437063d4c" (UID: "784b53ea-54fc-406b-8bc9-51e437063d4c"). InnerVolumeSpecName "kube-api-access-rbd62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.384553 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/784b53ea-54fc-406b-8bc9-51e437063d4c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.384588 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/784b53ea-54fc-406b-8bc9-51e437063d4c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.384599 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbd62\" (UniqueName: \"kubernetes.io/projected/784b53ea-54fc-406b-8bc9-51e437063d4c-kube-api-access-rbd62\") on node \"crc\" DevicePath \"\"" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.419109 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" event={"ID":"784b53ea-54fc-406b-8bc9-51e437063d4c","Type":"ContainerDied","Data":"7ad74dff58825571d386482c3174c619efcb396b1197f8665d452cb657a47552"} Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.419154 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk" Dec 04 01:00:03 crc kubenswrapper[4764]: I1204 01:00:03.419175 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ad74dff58825571d386482c3174c619efcb396b1197f8665d452cb657a47552" Dec 04 01:00:04 crc kubenswrapper[4764]: I1204 01:00:04.339345 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn"] Dec 04 01:00:04 crc kubenswrapper[4764]: I1204 01:00:04.351034 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413455-7slcn"] Dec 04 01:00:04 crc kubenswrapper[4764]: I1204 01:00:04.561988 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20927d90-3909-4581-879a-0aaf6d4997c7" path="/var/lib/kubelet/pods/20927d90-3909-4581-879a-0aaf6d4997c7/volumes" Dec 04 01:00:09 crc kubenswrapper[4764]: I1204 01:00:09.546328 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:00:09 crc kubenswrapper[4764]: E1204 01:00:09.547475 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:00:10 crc kubenswrapper[4764]: I1204 01:00:10.099896 4764 scope.go:117] "RemoveContainer" containerID="d9a170544faac2c83798b4c713fc3edde3f3b3efa9b309a497a722a6d2a4d8d2" Dec 04 01:00:24 crc kubenswrapper[4764]: I1204 01:00:24.554465 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 
01:00:24 crc kubenswrapper[4764]: E1204 01:00:24.555532 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.495872 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-rzsmc"] Dec 04 01:00:38 crc kubenswrapper[4764]: E1204 01:00:38.496532 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784b53ea-54fc-406b-8bc9-51e437063d4c" containerName="collect-profiles" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.496544 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="784b53ea-54fc-406b-8bc9-51e437063d4c" containerName="collect-profiles" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.496699 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="784b53ea-54fc-406b-8bc9-51e437063d4c" containerName="collect-profiles" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.497387 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.499238 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.500008 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.500172 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.501032 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2886g" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.501181 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.509179 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-rzsmc"] Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.546180 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:00:38 crc kubenswrapper[4764]: E1204 01:00:38.546414 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.634335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-dns-svc\") pod 
\"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.634376 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7ctk\" (UniqueName: \"kubernetes.io/projected/a766a4e6-38b0-4e9b-a711-08cf39888ddb-kube-api-access-f7ctk\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.634432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-config\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.735637 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-dns-svc\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.735671 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7ctk\" (UniqueName: \"kubernetes.io/projected/a766a4e6-38b0-4e9b-a711-08cf39888ddb-kube-api-access-f7ctk\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.735768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-config\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" 
(UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.736442 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-dns-svc\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.736636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-config\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.784068 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-jrfp8"] Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.786050 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.806529 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7ctk\" (UniqueName: \"kubernetes.io/projected/a766a4e6-38b0-4e9b-a711-08cf39888ddb-kube-api-access-f7ctk\") pod \"dnsmasq-dns-76d8c4d77f-rzsmc\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.813805 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.814503 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-jrfp8"] Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.942398 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-config\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.942446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:38 crc kubenswrapper[4764]: I1204 01:00:38.942495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc85v\" (UniqueName: \"kubernetes.io/projected/c6359974-fc56-4ce2-97a7-adb26e909a43-kube-api-access-gc85v\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.043817 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-config\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.044171 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.044242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc85v\" (UniqueName: \"kubernetes.io/projected/c6359974-fc56-4ce2-97a7-adb26e909a43-kube-api-access-gc85v\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.045625 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-config\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.046400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-dns-svc\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.084530 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc85v\" (UniqueName: \"kubernetes.io/projected/c6359974-fc56-4ce2-97a7-adb26e909a43-kube-api-access-gc85v\") pod \"dnsmasq-dns-7cbb4f659c-jrfp8\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.187344 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.440006 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-rzsmc"] Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.600402 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-jrfp8"] Dec 04 01:00:39 crc kubenswrapper[4764]: W1204 01:00:39.604908 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6359974_fc56_4ce2_97a7_adb26e909a43.slice/crio-8d3310c1f6221ba78bbaa28d0d68e4b6acc74f54779347b31e6ffb56675a2ea7 WatchSource:0}: Error finding container 8d3310c1f6221ba78bbaa28d0d68e4b6acc74f54779347b31e6ffb56675a2ea7: Status 404 returned error can't find the container with id 8d3310c1f6221ba78bbaa28d0d68e4b6acc74f54779347b31e6ffb56675a2ea7 Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.668363 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.673504 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.675892 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.676033 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lgr2q" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.676131 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.676268 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.677539 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.722635 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.753101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" event={"ID":"c6359974-fc56-4ce2-97a7-adb26e909a43","Type":"ContainerStarted","Data":"8d3310c1f6221ba78bbaa28d0d68e4b6acc74f54779347b31e6ffb56675a2ea7"} Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.755287 4764 generic.go:334] "Generic (PLEG): container finished" podID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerID="610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72" exitCode=0 Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.755359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" event={"ID":"a766a4e6-38b0-4e9b-a711-08cf39888ddb","Type":"ContainerDied","Data":"610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72"} Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 
01:00:39.755445 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" event={"ID":"a766a4e6-38b0-4e9b-a711-08cf39888ddb","Type":"ContainerStarted","Data":"b5301dd8d6fe0329be957e2986a5e54a20bcfeb1c32c80eb11f7f1e15ec753ac"} Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.856510 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3803716e-7167-4c44-b89d-c044d119c603-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.856848 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.856930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.856988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.857025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3803716e-7167-4c44-b89d-c044d119c603-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.857046 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpj9k\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-kube-api-access-qpj9k\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.857069 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.857086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.857108 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.918642 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 04 01:00:39 crc 
kubenswrapper[4764]: I1204 01:00:39.919761 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.923825 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.924054 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.924185 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.924800 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-mcmkm" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.932109 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.944748 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.945682 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.948075 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dp7sr" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.955977 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.959730 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968107 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3803716e-7167-4c44-b89d-c044d119c603-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968146 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpj9k\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-kube-api-access-qpj9k\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968171 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968206 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3803716e-7167-4c44-b89d-c044d119c603-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.968351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: 
I1204 01:00:39.969073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.970260 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.970862 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.971961 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.989241 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3803716e-7167-4c44-b89d-c044d119c603-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.989821 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3803716e-7167-4c44-b89d-c044d119c603-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.989920 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.989950 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/958dc5b9af5392dde11d4148f641f2bf2e13799ac6f1c7ef73161ca8aab9ec68/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.991382 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:39 crc kubenswrapper[4764]: I1204 01:00:39.995827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpj9k\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-kube-api-access-qpj9k\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.015773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.037934 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " pod="openstack/rabbitmq-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.070601 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.070976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-21cfab75-8d63-418e-95fa-55100703dac0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21cfab75-8d63-418e-95fa-55100703dac0\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc4pt\" (UniqueName: \"kubernetes.io/projected/96b866f4-5df7-4e21-96a9-f50903969fde-kube-api-access-hc4pt\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-kolla-config\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071314 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26vqh\" (UniqueName: \"kubernetes.io/projected/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-kube-api-access-26vqh\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071427 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-config-data-default\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071566 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071811 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96b866f4-5df7-4e21-96a9-f50903969fde-config-data\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071941 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96b866f4-5df7-4e21-96a9-f50903969fde-kolla-config\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.071980 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.104333 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.106109 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.108261 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.108417 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.108566 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.108835 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6dp4n" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.109135 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.123626 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173267 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-21cfab75-8d63-418e-95fa-55100703dac0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21cfab75-8d63-418e-95fa-55100703dac0\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173295 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc4pt\" (UniqueName: \"kubernetes.io/projected/96b866f4-5df7-4e21-96a9-f50903969fde-kube-api-access-hc4pt\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173332 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-kolla-config\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173357 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26vqh\" (UniqueName: \"kubernetes.io/projected/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-kube-api-access-26vqh\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: 
I1204 01:00:40.173384 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-config-data-default\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173421 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173472 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96b866f4-5df7-4e21-96a9-f50903969fde-config-data\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96b866f4-5df7-4e21-96a9-f50903969fde-kolla-config\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.173548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.174854 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-config-data-generated\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.175059 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/96b866f4-5df7-4e21-96a9-f50903969fde-kolla-config\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.175075 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96b866f4-5df7-4e21-96a9-f50903969fde-config-data\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.175347 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-kolla-config\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.176098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-config-data-default\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc 
kubenswrapper[4764]: I1204 01:00:40.178030 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-operator-scripts\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.178327 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.178380 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.178420 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-21cfab75-8d63-418e-95fa-55100703dac0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21cfab75-8d63-418e-95fa-55100703dac0\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e61b290e5fbb1e12d321a083e9c0eecd2bb0d7b0b0a52dbeccb92d5f1fe95031/globalmount\"" pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.179370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.192370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hc4pt\" (UniqueName: \"kubernetes.io/projected/96b866f4-5df7-4e21-96a9-f50903969fde-kube-api-access-hc4pt\") pod \"memcached-0\" (UID: \"96b866f4-5df7-4e21-96a9-f50903969fde\") " pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.201023 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26vqh\" (UniqueName: \"kubernetes.io/projected/8b1bbbf7-6f68-4a5a-897d-cfd533433b5a-kube-api-access-26vqh\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.209505 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-21cfab75-8d63-418e-95fa-55100703dac0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-21cfab75-8d63-418e-95fa-55100703dac0\") pod \"openstack-galera-0\" (UID: \"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a\") " pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.268002 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275323 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275398 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275540 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.275608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbfx\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-kube-api-access-cvbfx\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.290394 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.317274 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378383 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378469 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378498 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378548 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 
01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378590 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378620 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378656 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.378691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbfx\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-kube-api-access-cvbfx\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.380134 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.382236 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.382520 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.382641 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.384940 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.386583 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.386626 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/23e57f0cc5e6ab422f374ac541df9af06d7e27fee032ef5a63e8a38eec804fcf/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.389512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.400272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.402295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbfx\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-kube-api-access-cvbfx\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.444537 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: W1204 01:00:40.722822 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b1bbbf7_6f68_4a5a_897d_cfd533433b5a.slice/crio-611fe0c508d3675e973844fe069107ac370580f0fe7d7673751003d899bd8b5d WatchSource:0}: Error finding container 611fe0c508d3675e973844fe069107ac370580f0fe7d7673751003d899bd8b5d: Status 404 returned error can't find the container with id 611fe0c508d3675e973844fe069107ac370580f0fe7d7673751003d899bd8b5d Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.727185 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.732170 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.772566 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerID="e0d3c0a18dea79a0fc497e2976ff894fee01aaf2c1a065d7faf5425986973c1b" exitCode=0 Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.772651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" event={"ID":"c6359974-fc56-4ce2-97a7-adb26e909a43","Type":"ContainerDied","Data":"e0d3c0a18dea79a0fc497e2976ff894fee01aaf2c1a065d7faf5425986973c1b"} Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.786245 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" event={"ID":"a766a4e6-38b0-4e9b-a711-08cf39888ddb","Type":"ContainerStarted","Data":"d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc"} Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.786408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.798816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a","Type":"ContainerStarted","Data":"611fe0c508d3675e973844fe069107ac370580f0fe7d7673751003d899bd8b5d"} Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.831256 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 01:00:40 crc kubenswrapper[4764]: W1204 01:00:40.831630 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96b866f4_5df7_4e21_96a9_f50903969fde.slice/crio-1ab656aaa87b2cee907c856579ae0c9c26c50e448334c812a4f6f9cdf7810440 WatchSource:0}: Error finding container 1ab656aaa87b2cee907c856579ae0c9c26c50e448334c812a4f6f9cdf7810440: 
Status 404 returned error can't find the container with id 1ab656aaa87b2cee907c856579ae0c9c26c50e448334c812a4f6f9cdf7810440 Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.849109 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:00:40 crc kubenswrapper[4764]: I1204 01:00:40.859212 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" podStartSLOduration=2.859198579 podStartE2EDuration="2.859198579s" podCreationTimestamp="2025-12-04 01:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:00:40.848615658 +0000 UTC m=+4776.609940059" watchObservedRunningTime="2025-12-04 01:00:40.859198579 +0000 UTC m=+4776.620522990" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.007005 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.008110 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.010103 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hp9kl" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.014212 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.014259 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.014215 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.028090 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089753 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctkxn\" (UniqueName: \"kubernetes.io/projected/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-kube-api-access-ctkxn\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089794 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089820 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089916 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089944 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.089966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.190811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.190871 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkxn\" (UniqueName: \"kubernetes.io/projected/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-kube-api-access-ctkxn\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.190920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.190950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.190967 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.190999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.191025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.191043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.191774 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.192251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.192581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.193523 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.193545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.193568 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/948edff62e61c68e0571f8005e6af571644a638f38c5c0c63caaa32fbe6f6c2c/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.194576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.195707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.216730 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctkxn\" (UniqueName: \"kubernetes.io/projected/8fecb71e-e1c0-490f-97f7-b58e4a3f7c01-kube-api-access-ctkxn\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.224208 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1269c4f-ce5d-40cb-b4c4-6516ebc65e23\") pod \"openstack-cell1-galera-0\" (UID: \"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01\") " pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.283384 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:00:41 crc kubenswrapper[4764]: W1204 01:00:41.286887 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c2a3f03_17c5_46d6_b074_76bcdbf93abb.slice/crio-1c49859029cfa7b9a9703652ea2ae76b1695d46ccf63070fd1adc0c1867dc7f7 WatchSource:0}: Error finding container 1c49859029cfa7b9a9703652ea2ae76b1695d46ccf63070fd1adc0c1867dc7f7: Status 404 returned error can't find the 
container with id 1c49859029cfa7b9a9703652ea2ae76b1695d46ccf63070fd1adc0c1867dc7f7 Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.331416 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.753751 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 01:00:41 crc kubenswrapper[4764]: W1204 01:00:41.768387 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fecb71e_e1c0_490f_97f7_b58e4a3f7c01.slice/crio-ba758ccc4c55b3a16f299c2f145f3d2391703b55e79456c91e63e3e6b6cbb148 WatchSource:0}: Error finding container ba758ccc4c55b3a16f299c2f145f3d2391703b55e79456c91e63e3e6b6cbb148: Status 404 returned error can't find the container with id ba758ccc4c55b3a16f299c2f145f3d2391703b55e79456c91e63e3e6b6cbb148 Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.811835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" event={"ID":"c6359974-fc56-4ce2-97a7-adb26e909a43","Type":"ContainerStarted","Data":"e6219b97cc26ca4b52d230f62f58cc4e72c08a1b86db702f558a75bfbef87595"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.812242 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.815096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"96b866f4-5df7-4e21-96a9-f50903969fde","Type":"ContainerStarted","Data":"17e47af9571594818d3c1232a0b475af0dc35328aaca4d155b625fa32dc9a276"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.815137 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"96b866f4-5df7-4e21-96a9-f50903969fde","Type":"ContainerStarted","Data":"1ab656aaa87b2cee907c856579ae0c9c26c50e448334c812a4f6f9cdf7810440"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.815253 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.816516 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01","Type":"ContainerStarted","Data":"ba758ccc4c55b3a16f299c2f145f3d2391703b55e79456c91e63e3e6b6cbb148"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.818427 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7c2a3f03-17c5-46d6-b074-76bcdbf93abb","Type":"ContainerStarted","Data":"1c49859029cfa7b9a9703652ea2ae76b1695d46ccf63070fd1adc0c1867dc7f7"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.819293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3803716e-7167-4c44-b89d-c044d119c603","Type":"ContainerStarted","Data":"b509094648b50838b6b1312219ce120b48255a2d5e7098ff83cc740a98e16694"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.821824 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a","Type":"ContainerStarted","Data":"2422499064fc2d9bbed8de7d703dae13d98e8c95c6e287dd01e5c55ff083cf59"} Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.836209 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" podStartSLOduration=3.835625537 podStartE2EDuration="3.835625537s" podCreationTimestamp="2025-12-04 01:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:00:41.829404144 
+0000 UTC m=+4777.590728565" watchObservedRunningTime="2025-12-04 01:00:41.835625537 +0000 UTC m=+4777.596949958" Dec 04 01:00:41 crc kubenswrapper[4764]: I1204 01:00:41.860337 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.860311295 podStartE2EDuration="2.860311295s" podCreationTimestamp="2025-12-04 01:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:00:41.85197883 +0000 UTC m=+4777.613303251" watchObservedRunningTime="2025-12-04 01:00:41.860311295 +0000 UTC m=+4777.621635716" Dec 04 01:00:42 crc kubenswrapper[4764]: I1204 01:00:42.833267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3803716e-7167-4c44-b89d-c044d119c603","Type":"ContainerStarted","Data":"a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25"} Dec 04 01:00:42 crc kubenswrapper[4764]: I1204 01:00:42.836322 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01","Type":"ContainerStarted","Data":"d9e269e440f60618bac2433324b5f4032751eaa68c406399a0d567b4dbd06826"} Dec 04 01:00:42 crc kubenswrapper[4764]: I1204 01:00:42.839544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7c2a3f03-17c5-46d6-b074-76bcdbf93abb","Type":"ContainerStarted","Data":"0ced0743f6ac483667d77606480170536d057262dc75e970300d44031007053f"} Dec 04 01:00:45 crc kubenswrapper[4764]: I1204 01:00:45.869519 4764 generic.go:334] "Generic (PLEG): container finished" podID="8b1bbbf7-6f68-4a5a-897d-cfd533433b5a" containerID="2422499064fc2d9bbed8de7d703dae13d98e8c95c6e287dd01e5c55ff083cf59" exitCode=0 Dec 04 01:00:45 crc kubenswrapper[4764]: I1204 01:00:45.869918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a","Type":"ContainerDied","Data":"2422499064fc2d9bbed8de7d703dae13d98e8c95c6e287dd01e5c55ff083cf59"} Dec 04 01:00:46 crc kubenswrapper[4764]: I1204 01:00:46.884590 4764 generic.go:334] "Generic (PLEG): container finished" podID="8fecb71e-e1c0-490f-97f7-b58e4a3f7c01" containerID="d9e269e440f60618bac2433324b5f4032751eaa68c406399a0d567b4dbd06826" exitCode=0 Dec 04 01:00:46 crc kubenswrapper[4764]: I1204 01:00:46.884682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01","Type":"ContainerDied","Data":"d9e269e440f60618bac2433324b5f4032751eaa68c406399a0d567b4dbd06826"} Dec 04 01:00:46 crc kubenswrapper[4764]: I1204 01:00:46.888059 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"8b1bbbf7-6f68-4a5a-897d-cfd533433b5a","Type":"ContainerStarted","Data":"f2407e58de2bfc73c3117a55461a8232168d927fbd1a308551e7f0846e4eac1e"} Dec 04 01:00:47 crc kubenswrapper[4764]: I1204 01:00:47.902643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"8fecb71e-e1c0-490f-97f7-b58e4a3f7c01","Type":"ContainerStarted","Data":"5f7158d358e782985ef180d70f5b276099b7d9aa3c0da4cf9bd1874f0217ea44"} Dec 04 01:00:47 crc kubenswrapper[4764]: I1204 01:00:47.945695 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.945566047 podStartE2EDuration="9.945566047s" podCreationTimestamp="2025-12-04 01:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:00:46.964481005 +0000 UTC m=+4782.725805516" watchObservedRunningTime="2025-12-04 01:00:47.945566047 +0000 UTC m=+4783.706890488" Dec 04 01:00:47 crc kubenswrapper[4764]: I1204 01:00:47.960430 4764 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.960400443 podStartE2EDuration="8.960400443s" podCreationTimestamp="2025-12-04 01:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:00:47.948228923 +0000 UTC m=+4783.709553394" watchObservedRunningTime="2025-12-04 01:00:47.960400443 +0000 UTC m=+4783.721724894" Dec 04 01:00:48 crc kubenswrapper[4764]: E1204 01:00:48.089746 4764 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:53768->38.102.83.13:39483: write tcp 38.102.83.13:53768->38.102.83.13:39483: write: broken pipe Dec 04 01:00:48 crc kubenswrapper[4764]: I1204 01:00:48.816689 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.189986 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.272999 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-rzsmc"] Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.273406 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerName="dnsmasq-dns" containerID="cri-o://d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc" gracePeriod=10 Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.776884 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.830051 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7ctk\" (UniqueName: \"kubernetes.io/projected/a766a4e6-38b0-4e9b-a711-08cf39888ddb-kube-api-access-f7ctk\") pod \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.830503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-config\") pod \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.830827 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-dns-svc\") pod \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\" (UID: \"a766a4e6-38b0-4e9b-a711-08cf39888ddb\") " Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.853172 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a766a4e6-38b0-4e9b-a711-08cf39888ddb-kube-api-access-f7ctk" (OuterVolumeSpecName: "kube-api-access-f7ctk") pod "a766a4e6-38b0-4e9b-a711-08cf39888ddb" (UID: "a766a4e6-38b0-4e9b-a711-08cf39888ddb"). InnerVolumeSpecName "kube-api-access-f7ctk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.875562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-config" (OuterVolumeSpecName: "config") pod "a766a4e6-38b0-4e9b-a711-08cf39888ddb" (UID: "a766a4e6-38b0-4e9b-a711-08cf39888ddb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.879328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a766a4e6-38b0-4e9b-a711-08cf39888ddb" (UID: "a766a4e6-38b0-4e9b-a711-08cf39888ddb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.921193 4764 generic.go:334] "Generic (PLEG): container finished" podID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerID="d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc" exitCode=0 Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.921236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" event={"ID":"a766a4e6-38b0-4e9b-a711-08cf39888ddb","Type":"ContainerDied","Data":"d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc"} Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.921262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" event={"ID":"a766a4e6-38b0-4e9b-a711-08cf39888ddb","Type":"ContainerDied","Data":"b5301dd8d6fe0329be957e2986a5e54a20bcfeb1c32c80eb11f7f1e15ec753ac"} Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.921277 4764 scope.go:117] "RemoveContainer" containerID="d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.921386 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76d8c4d77f-rzsmc" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.932458 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.932494 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7ctk\" (UniqueName: \"kubernetes.io/projected/a766a4e6-38b0-4e9b-a711-08cf39888ddb-kube-api-access-f7ctk\") on node \"crc\" DevicePath \"\"" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.932509 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a766a4e6-38b0-4e9b-a711-08cf39888ddb-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.940271 4764 scope.go:117] "RemoveContainer" containerID="610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.957667 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-rzsmc"] Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.965111 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76d8c4d77f-rzsmc"] Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.976021 4764 scope.go:117] "RemoveContainer" containerID="d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc" Dec 04 01:00:49 crc kubenswrapper[4764]: E1204 01:00:49.976358 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc\": container with ID starting with d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc not found: ID does not exist" 
containerID="d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.976393 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc"} err="failed to get container status \"d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc\": rpc error: code = NotFound desc = could not find container \"d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc\": container with ID starting with d9bb6709fe3766e888e2fe90e79860dc17833f6b9698ae14a6b36653d38802fc not found: ID does not exist" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.976415 4764 scope.go:117] "RemoveContainer" containerID="610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72" Dec 04 01:00:49 crc kubenswrapper[4764]: E1204 01:00:49.976621 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72\": container with ID starting with 610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72 not found: ID does not exist" containerID="610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72" Dec 04 01:00:49 crc kubenswrapper[4764]: I1204 01:00:49.976647 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72"} err="failed to get container status \"610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72\": rpc error: code = NotFound desc = could not find container \"610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72\": container with ID starting with 610dea5f9915845a03e4cadb3275a59a9d36a2b4d2dd8ea619137a1a6d3aaf72 not found: ID does not exist" Dec 04 01:00:50 crc kubenswrapper[4764]: I1204 01:00:50.268658 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 01:00:50 crc kubenswrapper[4764]: I1204 01:00:50.268739 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 01:00:50 crc kubenswrapper[4764]: I1204 01:00:50.292888 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 01:00:50 crc kubenswrapper[4764]: I1204 01:00:50.557349 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" path="/var/lib/kubelet/pods/a766a4e6-38b0-4e9b-a711-08cf39888ddb/volumes" Dec 04 01:00:51 crc kubenswrapper[4764]: I1204 01:00:51.045070 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 01:00:51 crc kubenswrapper[4764]: I1204 01:00:51.154555 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 01:00:51 crc kubenswrapper[4764]: I1204 01:00:51.332368 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:51 crc kubenswrapper[4764]: I1204 01:00:51.332500 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:52 crc kubenswrapper[4764]: I1204 01:00:52.007888 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:53 crc kubenswrapper[4764]: I1204 01:00:53.074203 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 01:00:53 crc kubenswrapper[4764]: I1204 01:00:53.651288 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:00:53 crc kubenswrapper[4764]: E1204 01:00:53.651648 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:01:08 crc kubenswrapper[4764]: I1204 01:01:08.546594 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:01:08 crc kubenswrapper[4764]: E1204 01:01:08.547598 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:01:15 crc kubenswrapper[4764]: I1204 01:01:15.159895 4764 generic.go:334] "Generic (PLEG): container finished" podID="3803716e-7167-4c44-b89d-c044d119c603" containerID="a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25" exitCode=0 Dec 04 01:01:15 crc kubenswrapper[4764]: I1204 01:01:15.160003 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3803716e-7167-4c44-b89d-c044d119c603","Type":"ContainerDied","Data":"a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25"} Dec 04 01:01:16 crc kubenswrapper[4764]: I1204 01:01:16.170558 4764 generic.go:334] "Generic (PLEG): container finished" podID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerID="0ced0743f6ac483667d77606480170536d057262dc75e970300d44031007053f" exitCode=0 Dec 04 01:01:16 crc kubenswrapper[4764]: I1204 01:01:16.170671 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7c2a3f03-17c5-46d6-b074-76bcdbf93abb","Type":"ContainerDied","Data":"0ced0743f6ac483667d77606480170536d057262dc75e970300d44031007053f"} Dec 04 01:01:16 crc kubenswrapper[4764]: I1204 01:01:16.174865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3803716e-7167-4c44-b89d-c044d119c603","Type":"ContainerStarted","Data":"3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3"} Dec 04 01:01:16 crc kubenswrapper[4764]: I1204 01:01:16.175108 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 01:01:16 crc kubenswrapper[4764]: I1204 01:01:16.227432 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.227407285 podStartE2EDuration="38.227407285s" podCreationTimestamp="2025-12-04 01:00:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:01:16.216814404 +0000 UTC m=+4811.978138835" watchObservedRunningTime="2025-12-04 01:01:16.227407285 +0000 UTC m=+4811.988731706" Dec 04 01:01:17 crc kubenswrapper[4764]: I1204 01:01:17.183376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7c2a3f03-17c5-46d6-b074-76bcdbf93abb","Type":"ContainerStarted","Data":"1c5451291012ebbad0f14508b09d871bb2c6a2f10a387e6aa173aed7da858179"} Dec 04 01:01:17 crc kubenswrapper[4764]: I1204 01:01:17.183930 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:22 crc kubenswrapper[4764]: I1204 01:01:22.546171 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:01:22 crc kubenswrapper[4764]: E1204 01:01:22.546882 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:01:30 crc kubenswrapper[4764]: I1204 01:01:30.320977 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 01:01:30 crc kubenswrapper[4764]: I1204 01:01:30.370880 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.370846758 podStartE2EDuration="51.370846758s" podCreationTimestamp="2025-12-04 01:00:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:01:17.207342609 +0000 UTC m=+4812.968667020" watchObservedRunningTime="2025-12-04 01:01:30.370846758 +0000 UTC m=+4826.132171239" Dec 04 01:01:30 crc kubenswrapper[4764]: I1204 01:01:30.736117 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.841816 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-gwrqw"] Dec 04 01:01:33 crc kubenswrapper[4764]: E1204 01:01:33.842591 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerName="dnsmasq-dns" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.842612 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerName="dnsmasq-dns" Dec 04 01:01:33 crc kubenswrapper[4764]: E1204 01:01:33.842626 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerName="init" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.842638 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerName="init" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.842979 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a766a4e6-38b0-4e9b-a711-08cf39888ddb" containerName="dnsmasq-dns" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.844245 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.856039 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-gwrqw"] Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.911448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-dns-svc\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.911516 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmld8\" (UniqueName: \"kubernetes.io/projected/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-kube-api-access-zmld8\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:33 crc kubenswrapper[4764]: I1204 01:01:33.911606 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-config\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 
04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.012785 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-config\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.013080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-dns-svc\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.013109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmld8\" (UniqueName: \"kubernetes.io/projected/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-kube-api-access-zmld8\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.013665 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-config\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.013914 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-dns-svc\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.036070 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-zmld8\" (UniqueName: \"kubernetes.io/projected/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-kube-api-access-zmld8\") pod \"dnsmasq-dns-f79bf7859-gwrqw\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.166243 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.589899 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:01:34 crc kubenswrapper[4764]: I1204 01:01:34.775962 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-gwrqw"] Dec 04 01:01:35 crc kubenswrapper[4764]: I1204 01:01:35.206748 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:01:35 crc kubenswrapper[4764]: I1204 01:01:35.338964 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerID="eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676" exitCode=0 Dec 04 01:01:35 crc kubenswrapper[4764]: I1204 01:01:35.339017 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" event={"ID":"ad65f8fe-26a5-4700-8a13-a535dbcf5c73","Type":"ContainerDied","Data":"eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676"} Dec 04 01:01:35 crc kubenswrapper[4764]: I1204 01:01:35.339049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" event={"ID":"ad65f8fe-26a5-4700-8a13-a535dbcf5c73","Type":"ContainerStarted","Data":"b1640544bb985c236cb08f4facff874e89cf4fb7748480abd4aa15b094d58f7c"} Dec 04 01:01:35 crc kubenswrapper[4764]: I1204 01:01:35.545549 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:01:35 
crc kubenswrapper[4764]: E1204 01:01:35.545888 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:01:36 crc kubenswrapper[4764]: I1204 01:01:36.348859 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" event={"ID":"ad65f8fe-26a5-4700-8a13-a535dbcf5c73","Type":"ContainerStarted","Data":"fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31"} Dec 04 01:01:36 crc kubenswrapper[4764]: I1204 01:01:36.349192 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:36 crc kubenswrapper[4764]: I1204 01:01:36.376137 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" podStartSLOduration=3.376119357 podStartE2EDuration="3.376119357s" podCreationTimestamp="2025-12-04 01:01:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:01:36.372920719 +0000 UTC m=+4832.134245130" watchObservedRunningTime="2025-12-04 01:01:36.376119357 +0000 UTC m=+4832.137443768" Dec 04 01:01:36 crc kubenswrapper[4764]: I1204 01:01:36.451023 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="rabbitmq" containerID="cri-o://3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3" gracePeriod=604799 Dec 04 01:01:37 crc kubenswrapper[4764]: I1204 01:01:37.076521 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="rabbitmq" containerID="cri-o://1c5451291012ebbad0f14508b09d871bb2c6a2f10a387e6aa173aed7da858179" gracePeriod=604799 Dec 04 01:01:40 crc kubenswrapper[4764]: I1204 01:01:40.318495 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.240:5672: connect: connection refused" Dec 04 01:01:40 crc kubenswrapper[4764]: I1204 01:01:40.733332 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.243:5672: connect: connection refused" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.085136 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.169953 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170022 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-server-conf\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170063 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/3803716e-7167-4c44-b89d-c044d119c603-erlang-cookie-secret\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170145 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-plugins\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170213 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3803716e-7167-4c44-b89d-c044d119c603-pod-info\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170263 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-plugins-conf\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170289 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-erlang-cookie\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.170310 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-confd\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: 
I1204 01:01:43.170380 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpj9k\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-kube-api-access-qpj9k\") pod \"3803716e-7167-4c44-b89d-c044d119c603\" (UID: \"3803716e-7167-4c44-b89d-c044d119c603\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.176256 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.177906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-kube-api-access-qpj9k" (OuterVolumeSpecName: "kube-api-access-qpj9k") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "kube-api-access-qpj9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.182347 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3803716e-7167-4c44-b89d-c044d119c603-pod-info" (OuterVolumeSpecName: "pod-info") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.182466 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.183873 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3803716e-7167-4c44-b89d-c044d119c603-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.194494 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.198590 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-server-conf" (OuterVolumeSpecName: "server-conf") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.204503 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5" (OuterVolumeSpecName: "persistence") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271286 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271316 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271328 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpj9k\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-kube-api-access-qpj9k\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271359 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") on node \"crc\" " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271370 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3803716e-7167-4c44-b89d-c044d119c603-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 
01:01:43.271378 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3803716e-7167-4c44-b89d-c044d119c603-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271388 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.271396 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3803716e-7167-4c44-b89d-c044d119c603-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.281892 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3803716e-7167-4c44-b89d-c044d119c603" (UID: "3803716e-7167-4c44-b89d-c044d119c603"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.298561 4764 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.298710 4764 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5") on node "crc" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.373635 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3803716e-7167-4c44-b89d-c044d119c603-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.373729 4764 reconciler_common.go:293] "Volume detached for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.423250 4764 generic.go:334] "Generic (PLEG): container finished" podID="3803716e-7167-4c44-b89d-c044d119c603" containerID="3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3" exitCode=0 Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.423321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3803716e-7167-4c44-b89d-c044d119c603","Type":"ContainerDied","Data":"3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3"} Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.423336 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.423359 4764 scope.go:117] "RemoveContainer" containerID="3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.423348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3803716e-7167-4c44-b89d-c044d119c603","Type":"ContainerDied","Data":"b509094648b50838b6b1312219ce120b48255a2d5e7098ff83cc740a98e16694"} Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.427128 4764 generic.go:334] "Generic (PLEG): container finished" podID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerID="1c5451291012ebbad0f14508b09d871bb2c6a2f10a387e6aa173aed7da858179" exitCode=0 Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.427164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7c2a3f03-17c5-46d6-b074-76bcdbf93abb","Type":"ContainerDied","Data":"1c5451291012ebbad0f14508b09d871bb2c6a2f10a387e6aa173aed7da858179"} Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.442871 4764 scope.go:117] "RemoveContainer" containerID="a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.455813 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.463483 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.481795 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:01:43 crc kubenswrapper[4764]: E1204 01:01:43.482134 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="rabbitmq" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.482146 
4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="rabbitmq" Dec 04 01:01:43 crc kubenswrapper[4764]: E1204 01:01:43.482179 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="setup-container" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.482187 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="setup-container" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.482315 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3803716e-7167-4c44-b89d-c044d119c603" containerName="rabbitmq" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.484131 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.487402 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.488626 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.488631 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.488699 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.488913 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lgr2q" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.505649 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.522366 4764 scope.go:117] "RemoveContainer" 
containerID="3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3" Dec 04 01:01:43 crc kubenswrapper[4764]: E1204 01:01:43.523992 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3\": container with ID starting with 3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3 not found: ID does not exist" containerID="3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.524055 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3"} err="failed to get container status \"3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3\": rpc error: code = NotFound desc = could not find container \"3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3\": container with ID starting with 3cb16a97fdfff3f7e54008d816cf1e0600d07a322bfde0c22cd009ed98854aa3 not found: ID does not exist" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.524086 4764 scope.go:117] "RemoveContainer" containerID="a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25" Dec 04 01:01:43 crc kubenswrapper[4764]: E1204 01:01:43.528000 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25\": container with ID starting with a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25 not found: ID does not exist" containerID="a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.528081 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25"} err="failed to get container status \"a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25\": rpc error: code = NotFound desc = could not find container \"a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25\": container with ID starting with a3457a9087abd727e69839b772cac680a704c930b06aa01c6c359afdc2837d25 not found: ID does not exist" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.611535 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679239 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5633760a-81af-4a8e-a1d3-9eff9c06ca40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679392 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679773 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48gm\" (UniqueName: \"kubernetes.io/projected/5633760a-81af-4a8e-a1d3-9eff9c06ca40-kube-api-access-s48gm\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5633760a-81af-4a8e-a1d3-9eff9c06ca40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679903 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5633760a-81af-4a8e-a1d3-9eff9c06ca40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.679921 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/5633760a-81af-4a8e-a1d3-9eff9c06ca40-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.780972 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-erlang-cookie-secret\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781125 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-plugins\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-server-conf\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781225 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-pod-info\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781358 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-erlang-cookie\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 
01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781464 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-confd\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781497 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbfx\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-kube-api-access-cvbfx\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-plugins-conf\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781684 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\" (UID: \"7c2a3f03-17c5-46d6-b074-76bcdbf93abb\") " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.781999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782081 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s48gm\" (UniqueName: \"kubernetes.io/projected/5633760a-81af-4a8e-a1d3-9eff9c06ca40-kube-api-access-s48gm\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782131 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5633760a-81af-4a8e-a1d3-9eff9c06ca40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: 
I1204 01:01:43.782152 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5633760a-81af-4a8e-a1d3-9eff9c06ca40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5633760a-81af-4a8e-a1d3-9eff9c06ca40-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782213 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782219 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782242 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5633760a-81af-4a8e-a1d3-9eff9c06ca40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782299 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782314 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.782932 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.783061 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.783365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5633760a-81af-4a8e-a1d3-9eff9c06ca40-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.783920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.784559 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5633760a-81af-4a8e-a1d3-9eff9c06ca40-server-conf\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.787146 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-pod-info" (OuterVolumeSpecName: "pod-info") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.787504 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5633760a-81af-4a8e-a1d3-9eff9c06ca40-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.787915 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-kube-api-access-cvbfx" (OuterVolumeSpecName: "kube-api-access-cvbfx") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "kube-api-access-cvbfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.788127 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5633760a-81af-4a8e-a1d3-9eff9c06ca40-pod-info\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.788647 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.795306 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.795453 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/958dc5b9af5392dde11d4148f641f2bf2e13799ac6f1c7ef73161ca8aab9ec68/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.798958 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5633760a-81af-4a8e-a1d3-9eff9c06ca40-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.803198 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d" (OuterVolumeSpecName: "persistence") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "pvc-acc5e681-7ef1-43a0-8521-b9677e25727d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.805517 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48gm\" (UniqueName: \"kubernetes.io/projected/5633760a-81af-4a8e-a1d3-9eff9c06ca40-kube-api-access-s48gm\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.806750 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-server-conf" (OuterVolumeSpecName: "server-conf") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.870234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c240816a-f78a-4e3a-88f8-a49dcc1e65a5\") pod \"rabbitmq-server-0\" (UID: \"5633760a-81af-4a8e-a1d3-9eff9c06ca40\") " pod="openstack/rabbitmq-server-0" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.881447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7c2a3f03-17c5-46d6-b074-76bcdbf93abb" (UID: "7c2a3f03-17c5-46d6-b074-76bcdbf93abb"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883199 4764 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883227 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbfx\" (UniqueName: \"kubernetes.io/projected/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-kube-api-access-cvbfx\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883238 4764 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883271 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") on node \"crc\" " Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883282 4764 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883291 4764 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.883298 4764 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c2a3f03-17c5-46d6-b074-76bcdbf93abb-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 
01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.898873 4764 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.899185 4764 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-acc5e681-7ef1-43a0-8521-b9677e25727d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d") on node "crc" Dec 04 01:01:43 crc kubenswrapper[4764]: I1204 01:01:43.984944 4764 reconciler_common.go:293] "Volume detached for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.123837 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.168243 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.241137 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-jrfp8"] Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.241393 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerName="dnsmasq-dns" containerID="cri-o://e6219b97cc26ca4b52d230f62f58cc4e72c08a1b86db702f558a75bfbef87595" gracePeriod=10 Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.410156 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.436344 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"5633760a-81af-4a8e-a1d3-9eff9c06ca40","Type":"ContainerStarted","Data":"d2ebf639ee2bdba4dba90bc2086edf2e89d47991fb56f0a9c1d7b7d7fc3d98a5"} Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.438366 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"7c2a3f03-17c5-46d6-b074-76bcdbf93abb","Type":"ContainerDied","Data":"1c49859029cfa7b9a9703652ea2ae76b1695d46ccf63070fd1adc0c1867dc7f7"} Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.438398 4764 scope.go:117] "RemoveContainer" containerID="1c5451291012ebbad0f14508b09d871bb2c6a2f10a387e6aa173aed7da858179" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.438461 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.445541 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerID="e6219b97cc26ca4b52d230f62f58cc4e72c08a1b86db702f558a75bfbef87595" exitCode=0 Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.445583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" event={"ID":"c6359974-fc56-4ce2-97a7-adb26e909a43","Type":"ContainerDied","Data":"e6219b97cc26ca4b52d230f62f58cc4e72c08a1b86db702f558a75bfbef87595"} Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.461694 4764 scope.go:117] "RemoveContainer" containerID="0ced0743f6ac483667d77606480170536d057262dc75e970300d44031007053f" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.502507 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.517047 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.525375 4764 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:01:44 crc kubenswrapper[4764]: E1204 01:01:44.525805 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="rabbitmq" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.525817 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="rabbitmq" Dec 04 01:01:44 crc kubenswrapper[4764]: E1204 01:01:44.525828 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="setup-container" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.525836 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="setup-container" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.525978 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" containerName="rabbitmq" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.526699 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.529349 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.529494 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.529619 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6dp4n" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.529842 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.530070 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.534605 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.584474 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3803716e-7167-4c44-b89d-c044d119c603" path="/var/lib/kubelet/pods/3803716e-7167-4c44-b89d-c044d119c603/volumes" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.585468 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c2a3f03-17c5-46d6-b074-76bcdbf93abb" path="/var/lib/kubelet/pods/7c2a3f03-17c5-46d6-b074-76bcdbf93abb/volumes" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.655760 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698487 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62f1b02a-92d6-496e-b922-2de70fae0f9a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698532 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62f1b02a-92d6-496e-b922-2de70fae0f9a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5qx\" (UniqueName: \"kubernetes.io/projected/62f1b02a-92d6-496e-b922-2de70fae0f9a-kube-api-access-bx5qx\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698602 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.698676 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62f1b02a-92d6-496e-b922-2de70fae0f9a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.699379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62f1b02a-92d6-496e-b922-2de70fae0f9a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.699486 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801097 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-config\") pod 
\"c6359974-fc56-4ce2-97a7-adb26e909a43\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc85v\" (UniqueName: \"kubernetes.io/projected/c6359974-fc56-4ce2-97a7-adb26e909a43-kube-api-access-gc85v\") pod \"c6359974-fc56-4ce2-97a7-adb26e909a43\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801219 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-dns-svc\") pod \"c6359974-fc56-4ce2-97a7-adb26e909a43\" (UID: \"c6359974-fc56-4ce2-97a7-adb26e909a43\") " Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801496 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62f1b02a-92d6-496e-b922-2de70fae0f9a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62f1b02a-92d6-496e-b922-2de70fae0f9a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 
01:01:44.801633 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62f1b02a-92d6-496e-b922-2de70fae0f9a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801658 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801685 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62f1b02a-92d6-496e-b922-2de70fae0f9a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5qx\" (UniqueName: \"kubernetes.io/projected/62f1b02a-92d6-496e-b922-2de70fae0f9a-kube-api-access-bx5qx\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801745 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.801772 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.802262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.803701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/62f1b02a-92d6-496e-b922-2de70fae0f9a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.804053 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.804328 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/62f1b02a-92d6-496e-b922-2de70fae0f9a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.807515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/62f1b02a-92d6-496e-b922-2de70fae0f9a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.808317 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.808364 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/23e57f0cc5e6ab422f374ac541df9af06d7e27fee032ef5a63e8a38eec804fcf/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.808345 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/62f1b02a-92d6-496e-b922-2de70fae0f9a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.808527 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/62f1b02a-92d6-496e-b922-2de70fae0f9a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.809361 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6359974-fc56-4ce2-97a7-adb26e909a43-kube-api-access-gc85v" (OuterVolumeSpecName: 
"kube-api-access-gc85v") pod "c6359974-fc56-4ce2-97a7-adb26e909a43" (UID: "c6359974-fc56-4ce2-97a7-adb26e909a43"). InnerVolumeSpecName "kube-api-access-gc85v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.824667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5qx\" (UniqueName: \"kubernetes.io/projected/62f1b02a-92d6-496e-b922-2de70fae0f9a-kube-api-access-bx5qx\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.859214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-config" (OuterVolumeSpecName: "config") pod "c6359974-fc56-4ce2-97a7-adb26e909a43" (UID: "c6359974-fc56-4ce2-97a7-adb26e909a43"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.862583 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-acc5e681-7ef1-43a0-8521-b9677e25727d\") pod \"rabbitmq-cell1-server-0\" (UID: \"62f1b02a-92d6-496e-b922-2de70fae0f9a\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.868649 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6359974-fc56-4ce2-97a7-adb26e909a43" (UID: "c6359974-fc56-4ce2-97a7-adb26e909a43"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.903011 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.903054 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc85v\" (UniqueName: \"kubernetes.io/projected/c6359974-fc56-4ce2-97a7-adb26e909a43-kube-api-access-gc85v\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:44 crc kubenswrapper[4764]: I1204 01:01:44.903067 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6359974-fc56-4ce2-97a7-adb26e909a43-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.152557 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.455527 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" event={"ID":"c6359974-fc56-4ce2-97a7-adb26e909a43","Type":"ContainerDied","Data":"8d3310c1f6221ba78bbaa28d0d68e4b6acc74f54779347b31e6ffb56675a2ea7"} Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.455923 4764 scope.go:117] "RemoveContainer" containerID="e6219b97cc26ca4b52d230f62f58cc4e72c08a1b86db702f558a75bfbef87595" Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.455574 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cbb4f659c-jrfp8" Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.481913 4764 scope.go:117] "RemoveContainer" containerID="e0d3c0a18dea79a0fc497e2976ff894fee01aaf2c1a065d7faf5425986973c1b" Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.506798 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-jrfp8"] Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.515232 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cbb4f659c-jrfp8"] Dec 04 01:01:45 crc kubenswrapper[4764]: W1204 01:01:45.669416 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f1b02a_92d6_496e_b922_2de70fae0f9a.slice/crio-d01910198490766f78d83444c006c4a395a723ec832f8b8f39e43afb4118e281 WatchSource:0}: Error finding container d01910198490766f78d83444c006c4a395a723ec832f8b8f39e43afb4118e281: Status 404 returned error can't find the container with id d01910198490766f78d83444c006c4a395a723ec832f8b8f39e43afb4118e281 Dec 04 01:01:45 crc kubenswrapper[4764]: I1204 01:01:45.671055 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 01:01:46 crc kubenswrapper[4764]: I1204 01:01:46.472501 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62f1b02a-92d6-496e-b922-2de70fae0f9a","Type":"ContainerStarted","Data":"d01910198490766f78d83444c006c4a395a723ec832f8b8f39e43afb4118e281"} Dec 04 01:01:46 crc kubenswrapper[4764]: I1204 01:01:46.561653 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" path="/var/lib/kubelet/pods/c6359974-fc56-4ce2-97a7-adb26e909a43/volumes" Dec 04 01:01:47 crc kubenswrapper[4764]: I1204 01:01:47.546443 4764 scope.go:117] "RemoveContainer" 
containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:01:47 crc kubenswrapper[4764]: E1204 01:01:47.547309 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:01:48 crc kubenswrapper[4764]: I1204 01:01:48.499439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62f1b02a-92d6-496e-b922-2de70fae0f9a","Type":"ContainerStarted","Data":"af7331b2cccb2a15347d6a408b4361094d3c434a9c2a5a594acfbc0f19967bb3"} Dec 04 01:01:48 crc kubenswrapper[4764]: I1204 01:01:48.503530 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5633760a-81af-4a8e-a1d3-9eff9c06ca40","Type":"ContainerStarted","Data":"e75add6141bac59a27bcbffa13158cb3acc1a57b7df3f088209e590943968b2c"} Dec 04 01:01:58 crc kubenswrapper[4764]: I1204 01:01:58.547027 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:01:58 crc kubenswrapper[4764]: E1204 01:01:58.548484 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:02:09 crc kubenswrapper[4764]: I1204 01:02:09.546411 4764 scope.go:117] "RemoveContainer" 
containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:02:09 crc kubenswrapper[4764]: E1204 01:02:09.547767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:02:21 crc kubenswrapper[4764]: I1204 01:02:21.546492 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:02:21 crc kubenswrapper[4764]: E1204 01:02:21.552470 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:02:21 crc kubenswrapper[4764]: I1204 01:02:21.821571 4764 generic.go:334] "Generic (PLEG): container finished" podID="5633760a-81af-4a8e-a1d3-9eff9c06ca40" containerID="e75add6141bac59a27bcbffa13158cb3acc1a57b7df3f088209e590943968b2c" exitCode=0 Dec 04 01:02:21 crc kubenswrapper[4764]: I1204 01:02:21.821630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5633760a-81af-4a8e-a1d3-9eff9c06ca40","Type":"ContainerDied","Data":"e75add6141bac59a27bcbffa13158cb3acc1a57b7df3f088209e590943968b2c"} Dec 04 01:02:21 crc kubenswrapper[4764]: I1204 01:02:21.823710 4764 generic.go:334] "Generic (PLEG): container finished" podID="62f1b02a-92d6-496e-b922-2de70fae0f9a" 
containerID="af7331b2cccb2a15347d6a408b4361094d3c434a9c2a5a594acfbc0f19967bb3" exitCode=0 Dec 04 01:02:21 crc kubenswrapper[4764]: I1204 01:02:21.823787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62f1b02a-92d6-496e-b922-2de70fae0f9a","Type":"ContainerDied","Data":"af7331b2cccb2a15347d6a408b4361094d3c434a9c2a5a594acfbc0f19967bb3"} Dec 04 01:02:22 crc kubenswrapper[4764]: I1204 01:02:22.835491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"5633760a-81af-4a8e-a1d3-9eff9c06ca40","Type":"ContainerStarted","Data":"1c794c7b1474a8b6059bfb8bc1836d58681863f08b7e2361a2c3f57cf2ecbb27"} Dec 04 01:02:22 crc kubenswrapper[4764]: I1204 01:02:22.836146 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 01:02:22 crc kubenswrapper[4764]: I1204 01:02:22.837751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"62f1b02a-92d6-496e-b922-2de70fae0f9a","Type":"ContainerStarted","Data":"a56b42b7138c345af2dd8814296047f0022c1ac74a7a3f9c3825a1b980d0db30"} Dec 04 01:02:22 crc kubenswrapper[4764]: I1204 01:02:22.838066 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:02:22 crc kubenswrapper[4764]: I1204 01:02:22.870982 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.870957537 podStartE2EDuration="39.870957537s" podCreationTimestamp="2025-12-04 01:01:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:02:22.859565896 +0000 UTC m=+4878.620890397" watchObservedRunningTime="2025-12-04 01:02:22.870957537 +0000 UTC m=+4878.632281958" Dec 04 01:02:22 crc kubenswrapper[4764]: I1204 01:02:22.891576 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.891548844 podStartE2EDuration="38.891548844s" podCreationTimestamp="2025-12-04 01:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:02:22.883131757 +0000 UTC m=+4878.644456228" watchObservedRunningTime="2025-12-04 01:02:22.891548844 +0000 UTC m=+4878.652873265" Dec 04 01:02:34 crc kubenswrapper[4764]: I1204 01:02:34.126980 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 01:02:35 crc kubenswrapper[4764]: I1204 01:02:35.157229 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 01:02:36 crc kubenswrapper[4764]: I1204 01:02:36.546429 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:02:36 crc kubenswrapper[4764]: I1204 01:02:36.967266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"b3e111b765473560b18e521875a6b90a1db7680da6cb7c351e1d070319575147"} Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.405054 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 04 01:02:46 crc kubenswrapper[4764]: E1204 01:02:46.406218 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerName="init" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.406241 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerName="init" Dec 04 01:02:46 crc kubenswrapper[4764]: E1204 01:02:46.406275 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerName="dnsmasq-dns" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.406288 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerName="dnsmasq-dns" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.406628 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6359974-fc56-4ce2-97a7-adb26e909a43" containerName="dnsmasq-dns" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.407502 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.410790 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jvh67" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.416317 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.539353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44lx\" (UniqueName: \"kubernetes.io/projected/11320cae-39e0-4077-b669-9866a661273e-kube-api-access-s44lx\") pod \"mariadb-client-1-default\" (UID: \"11320cae-39e0-4077-b669-9866a661273e\") " pod="openstack/mariadb-client-1-default" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.641154 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44lx\" (UniqueName: \"kubernetes.io/projected/11320cae-39e0-4077-b669-9866a661273e-kube-api-access-s44lx\") pod \"mariadb-client-1-default\" (UID: \"11320cae-39e0-4077-b669-9866a661273e\") " pod="openstack/mariadb-client-1-default" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.663014 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44lx\" (UniqueName: 
\"kubernetes.io/projected/11320cae-39e0-4077-b669-9866a661273e-kube-api-access-s44lx\") pod \"mariadb-client-1-default\" (UID: \"11320cae-39e0-4077-b669-9866a661273e\") " pod="openstack/mariadb-client-1-default" Dec 04 01:02:46 crc kubenswrapper[4764]: I1204 01:02:46.736063 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 04 01:02:47 crc kubenswrapper[4764]: I1204 01:02:47.322621 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 04 01:02:47 crc kubenswrapper[4764]: I1204 01:02:47.327854 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:02:48 crc kubenswrapper[4764]: I1204 01:02:48.083084 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"11320cae-39e0-4077-b669-9866a661273e","Type":"ContainerStarted","Data":"3db57e10925e580a7d0c61122fbebed338974f24e1e834f1d5be41fb02bd3044"} Dec 04 01:02:49 crc kubenswrapper[4764]: I1204 01:02:49.094812 4764 generic.go:334] "Generic (PLEG): container finished" podID="11320cae-39e0-4077-b669-9866a661273e" containerID="908f8882c13fa4ddd89baecf281671248a003598034ff7da8bf7e2c50f753f3b" exitCode=0 Dec 04 01:02:49 crc kubenswrapper[4764]: I1204 01:02:49.094911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"11320cae-39e0-4077-b669-9866a661273e","Type":"ContainerDied","Data":"908f8882c13fa4ddd89baecf281671248a003598034ff7da8bf7e2c50f753f3b"} Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.496526 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.528783 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_11320cae-39e0-4077-b669-9866a661273e/mariadb-client-1-default/0.log" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.563080 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.563119 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.602840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s44lx\" (UniqueName: \"kubernetes.io/projected/11320cae-39e0-4077-b669-9866a661273e-kube-api-access-s44lx\") pod \"11320cae-39e0-4077-b669-9866a661273e\" (UID: \"11320cae-39e0-4077-b669-9866a661273e\") " Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.611842 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11320cae-39e0-4077-b669-9866a661273e-kube-api-access-s44lx" (OuterVolumeSpecName: "kube-api-access-s44lx") pod "11320cae-39e0-4077-b669-9866a661273e" (UID: "11320cae-39e0-4077-b669-9866a661273e"). InnerVolumeSpecName "kube-api-access-s44lx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.704925 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s44lx\" (UniqueName: \"kubernetes.io/projected/11320cae-39e0-4077-b669-9866a661273e-kube-api-access-s44lx\") on node \"crc\" DevicePath \"\"" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.942301 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 04 01:02:50 crc kubenswrapper[4764]: E1204 01:02:50.943032 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11320cae-39e0-4077-b669-9866a661273e" containerName="mariadb-client-1-default" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.943055 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="11320cae-39e0-4077-b669-9866a661273e" containerName="mariadb-client-1-default" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.943238 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="11320cae-39e0-4077-b669-9866a661273e" containerName="mariadb-client-1-default" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.943848 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 04 01:02:50 crc kubenswrapper[4764]: I1204 01:02:50.956411 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.010689 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjzcl\" (UniqueName: \"kubernetes.io/projected/6059e8f2-a981-42a8-91a2-b619d812677e-kube-api-access-hjzcl\") pod \"mariadb-client-2-default\" (UID: \"6059e8f2-a981-42a8-91a2-b619d812677e\") " pod="openstack/mariadb-client-2-default" Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.111116 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3db57e10925e580a7d0c61122fbebed338974f24e1e834f1d5be41fb02bd3044" Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.111245 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.115043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjzcl\" (UniqueName: \"kubernetes.io/projected/6059e8f2-a981-42a8-91a2-b619d812677e-kube-api-access-hjzcl\") pod \"mariadb-client-2-default\" (UID: \"6059e8f2-a981-42a8-91a2-b619d812677e\") " pod="openstack/mariadb-client-2-default" Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.134443 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjzcl\" (UniqueName: \"kubernetes.io/projected/6059e8f2-a981-42a8-91a2-b619d812677e-kube-api-access-hjzcl\") pod \"mariadb-client-2-default\" (UID: \"6059e8f2-a981-42a8-91a2-b619d812677e\") " pod="openstack/mariadb-client-2-default" Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.267878 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 04 01:02:51 crc kubenswrapper[4764]: I1204 01:02:51.594515 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 04 01:02:51 crc kubenswrapper[4764]: W1204 01:02:51.967687 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6059e8f2_a981_42a8_91a2_b619d812677e.slice/crio-72fca186c554b0495041b7129ec2a3bb9ab1a353d867e45d6255eddb68892537 WatchSource:0}: Error finding container 72fca186c554b0495041b7129ec2a3bb9ab1a353d867e45d6255eddb68892537: Status 404 returned error can't find the container with id 72fca186c554b0495041b7129ec2a3bb9ab1a353d867e45d6255eddb68892537 Dec 04 01:02:52 crc kubenswrapper[4764]: I1204 01:02:52.123078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6059e8f2-a981-42a8-91a2-b619d812677e","Type":"ContainerStarted","Data":"72fca186c554b0495041b7129ec2a3bb9ab1a353d867e45d6255eddb68892537"} Dec 04 01:02:52 crc kubenswrapper[4764]: I1204 01:02:52.556963 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11320cae-39e0-4077-b669-9866a661273e" path="/var/lib/kubelet/pods/11320cae-39e0-4077-b669-9866a661273e/volumes" Dec 04 01:02:53 crc kubenswrapper[4764]: I1204 01:02:53.132597 4764 generic.go:334] "Generic (PLEG): container finished" podID="6059e8f2-a981-42a8-91a2-b619d812677e" containerID="28ebd4cdf85bce048ab2c5ee28648941d0806d8c732311ca155d5cccaef0ecef" exitCode=1 Dec 04 01:02:53 crc kubenswrapper[4764]: I1204 01:02:53.132679 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"6059e8f2-a981-42a8-91a2-b619d812677e","Type":"ContainerDied","Data":"28ebd4cdf85bce048ab2c5ee28648941d0806d8c732311ca155d5cccaef0ecef"} Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.493539 4764 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.514867 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_6059e8f2-a981-42a8-91a2-b619d812677e/mariadb-client-2-default/0.log" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.570192 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjzcl\" (UniqueName: \"kubernetes.io/projected/6059e8f2-a981-42a8-91a2-b619d812677e-kube-api-access-hjzcl\") pod \"6059e8f2-a981-42a8-91a2-b619d812677e\" (UID: \"6059e8f2-a981-42a8-91a2-b619d812677e\") " Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.575173 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.578111 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6059e8f2-a981-42a8-91a2-b619d812677e-kube-api-access-hjzcl" (OuterVolumeSpecName: "kube-api-access-hjzcl") pod "6059e8f2-a981-42a8-91a2-b619d812677e" (UID: "6059e8f2-a981-42a8-91a2-b619d812677e"). InnerVolumeSpecName "kube-api-access-hjzcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.580335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.679074 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjzcl\" (UniqueName: \"kubernetes.io/projected/6059e8f2-a981-42a8-91a2-b619d812677e-kube-api-access-hjzcl\") on node \"crc\" DevicePath \"\"" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.956425 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 04 01:02:54 crc kubenswrapper[4764]: E1204 01:02:54.957128 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6059e8f2-a981-42a8-91a2-b619d812677e" containerName="mariadb-client-2-default" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.957178 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6059e8f2-a981-42a8-91a2-b619d812677e" containerName="mariadb-client-2-default" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.958226 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6059e8f2-a981-42a8-91a2-b619d812677e" containerName="mariadb-client-2-default" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.960232 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 04 01:02:54 crc kubenswrapper[4764]: I1204 01:02:54.965835 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.088240 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjlx\" (UniqueName: \"kubernetes.io/projected/febcc604-ee7a-4e25-84cc-cc43db4f5af5-kube-api-access-fjjlx\") pod \"mariadb-client-1\" (UID: \"febcc604-ee7a-4e25-84cc-cc43db4f5af5\") " pod="openstack/mariadb-client-1" Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.153573 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72fca186c554b0495041b7129ec2a3bb9ab1a353d867e45d6255eddb68892537" Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.153617 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.190471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjlx\" (UniqueName: \"kubernetes.io/projected/febcc604-ee7a-4e25-84cc-cc43db4f5af5-kube-api-access-fjjlx\") pod \"mariadb-client-1\" (UID: \"febcc604-ee7a-4e25-84cc-cc43db4f5af5\") " pod="openstack/mariadb-client-1" Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.221872 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjlx\" (UniqueName: \"kubernetes.io/projected/febcc604-ee7a-4e25-84cc-cc43db4f5af5-kube-api-access-fjjlx\") pod \"mariadb-client-1\" (UID: \"febcc604-ee7a-4e25-84cc-cc43db4f5af5\") " pod="openstack/mariadb-client-1" Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.288422 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 04 01:02:55 crc kubenswrapper[4764]: I1204 01:02:55.890598 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 04 01:02:55 crc kubenswrapper[4764]: W1204 01:02:55.898776 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfebcc604_ee7a_4e25_84cc_cc43db4f5af5.slice/crio-20c55ac6095a13a17db92c4072b10fbadfffa3c2fff9b251b4ddd99033e1da13 WatchSource:0}: Error finding container 20c55ac6095a13a17db92c4072b10fbadfffa3c2fff9b251b4ddd99033e1da13: Status 404 returned error can't find the container with id 20c55ac6095a13a17db92c4072b10fbadfffa3c2fff9b251b4ddd99033e1da13 Dec 04 01:02:56 crc kubenswrapper[4764]: I1204 01:02:56.167335 4764 generic.go:334] "Generic (PLEG): container finished" podID="febcc604-ee7a-4e25-84cc-cc43db4f5af5" containerID="d8f9c8a8d8cb9e51294cefd1e2a2697e5b8e5008a429d09d07e46c613616d2a9" exitCode=0 Dec 04 01:02:56 crc kubenswrapper[4764]: I1204 01:02:56.167391 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"febcc604-ee7a-4e25-84cc-cc43db4f5af5","Type":"ContainerDied","Data":"d8f9c8a8d8cb9e51294cefd1e2a2697e5b8e5008a429d09d07e46c613616d2a9"} Dec 04 01:02:56 crc kubenswrapper[4764]: I1204 01:02:56.167422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"febcc604-ee7a-4e25-84cc-cc43db4f5af5","Type":"ContainerStarted","Data":"20c55ac6095a13a17db92c4072b10fbadfffa3c2fff9b251b4ddd99033e1da13"} Dec 04 01:02:56 crc kubenswrapper[4764]: I1204 01:02:56.570443 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6059e8f2-a981-42a8-91a2-b619d812677e" path="/var/lib/kubelet/pods/6059e8f2-a981-42a8-91a2-b619d812677e/volumes" Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.618679 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.641286 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_febcc604-ee7a-4e25-84cc-cc43db4f5af5/mariadb-client-1/0.log" Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.665531 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.672018 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.731146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjlx\" (UniqueName: \"kubernetes.io/projected/febcc604-ee7a-4e25-84cc-cc43db4f5af5-kube-api-access-fjjlx\") pod \"febcc604-ee7a-4e25-84cc-cc43db4f5af5\" (UID: \"febcc604-ee7a-4e25-84cc-cc43db4f5af5\") " Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.738350 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febcc604-ee7a-4e25-84cc-cc43db4f5af5-kube-api-access-fjjlx" (OuterVolumeSpecName: "kube-api-access-fjjlx") pod "febcc604-ee7a-4e25-84cc-cc43db4f5af5" (UID: "febcc604-ee7a-4e25-84cc-cc43db4f5af5"). InnerVolumeSpecName "kube-api-access-fjjlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:02:57 crc kubenswrapper[4764]: I1204 01:02:57.832833 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjlx\" (UniqueName: \"kubernetes.io/projected/febcc604-ee7a-4e25-84cc-cc43db4f5af5-kube-api-access-fjjlx\") on node \"crc\" DevicePath \"\"" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.148943 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 04 01:02:58 crc kubenswrapper[4764]: E1204 01:02:58.149449 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="febcc604-ee7a-4e25-84cc-cc43db4f5af5" containerName="mariadb-client-1" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.149481 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="febcc604-ee7a-4e25-84cc-cc43db4f5af5" containerName="mariadb-client-1" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.149781 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="febcc604-ee7a-4e25-84cc-cc43db4f5af5" containerName="mariadb-client-1" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.150562 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.163488 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.193546 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c55ac6095a13a17db92c4072b10fbadfffa3c2fff9b251b4ddd99033e1da13" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.193602 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.238680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpgr7\" (UniqueName: \"kubernetes.io/projected/bc0f5488-de3d-47f4-a5b2-4efe86710df6-kube-api-access-jpgr7\") pod \"mariadb-client-4-default\" (UID: \"bc0f5488-de3d-47f4-a5b2-4efe86710df6\") " pod="openstack/mariadb-client-4-default" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.340106 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpgr7\" (UniqueName: \"kubernetes.io/projected/bc0f5488-de3d-47f4-a5b2-4efe86710df6-kube-api-access-jpgr7\") pod \"mariadb-client-4-default\" (UID: \"bc0f5488-de3d-47f4-a5b2-4efe86710df6\") " pod="openstack/mariadb-client-4-default" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.375681 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpgr7\" (UniqueName: \"kubernetes.io/projected/bc0f5488-de3d-47f4-a5b2-4efe86710df6-kube-api-access-jpgr7\") pod \"mariadb-client-4-default\" (UID: \"bc0f5488-de3d-47f4-a5b2-4efe86710df6\") " pod="openstack/mariadb-client-4-default" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.480608 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 04 01:02:58 crc kubenswrapper[4764]: I1204 01:02:58.560531 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="febcc604-ee7a-4e25-84cc-cc43db4f5af5" path="/var/lib/kubelet/pods/febcc604-ee7a-4e25-84cc-cc43db4f5af5/volumes" Dec 04 01:02:59 crc kubenswrapper[4764]: I1204 01:02:59.040566 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 04 01:02:59 crc kubenswrapper[4764]: W1204 01:02:59.041312 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc0f5488_de3d_47f4_a5b2_4efe86710df6.slice/crio-b663d9d02ded5a3b4e77e7818d41fa42f99ef10110224de48300f4e749b546de WatchSource:0}: Error finding container b663d9d02ded5a3b4e77e7818d41fa42f99ef10110224de48300f4e749b546de: Status 404 returned error can't find the container with id b663d9d02ded5a3b4e77e7818d41fa42f99ef10110224de48300f4e749b546de Dec 04 01:02:59 crc kubenswrapper[4764]: I1204 01:02:59.202460 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"bc0f5488-de3d-47f4-a5b2-4efe86710df6","Type":"ContainerStarted","Data":"b663d9d02ded5a3b4e77e7818d41fa42f99ef10110224de48300f4e749b546de"} Dec 04 01:02:59 crc kubenswrapper[4764]: E1204 01:02:59.297186 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc0f5488_de3d_47f4_a5b2_4efe86710df6.slice/crio-22e38ce74dfe448caac44360669f607fb0bf2641aadafa2f34574f3b12140511.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc0f5488_de3d_47f4_a5b2_4efe86710df6.slice/crio-conmon-22e38ce74dfe448caac44360669f607fb0bf2641aadafa2f34574f3b12140511.scope\": RecentStats: unable to find data in memory cache]" Dec 04 
01:03:00 crc kubenswrapper[4764]: I1204 01:03:00.214489 4764 generic.go:334] "Generic (PLEG): container finished" podID="bc0f5488-de3d-47f4-a5b2-4efe86710df6" containerID="22e38ce74dfe448caac44360669f607fb0bf2641aadafa2f34574f3b12140511" exitCode=0 Dec 04 01:03:00 crc kubenswrapper[4764]: I1204 01:03:00.214563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"bc0f5488-de3d-47f4-a5b2-4efe86710df6","Type":"ContainerDied","Data":"22e38ce74dfe448caac44360669f607fb0bf2641aadafa2f34574f3b12140511"} Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.706366 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.727020 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpgr7\" (UniqueName: \"kubernetes.io/projected/bc0f5488-de3d-47f4-a5b2-4efe86710df6-kube-api-access-jpgr7\") pod \"bc0f5488-de3d-47f4-a5b2-4efe86710df6\" (UID: \"bc0f5488-de3d-47f4-a5b2-4efe86710df6\") " Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.731770 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_bc0f5488-de3d-47f4-a5b2-4efe86710df6/mariadb-client-4-default/0.log" Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.740302 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0f5488-de3d-47f4-a5b2-4efe86710df6-kube-api-access-jpgr7" (OuterVolumeSpecName: "kube-api-access-jpgr7") pod "bc0f5488-de3d-47f4-a5b2-4efe86710df6" (UID: "bc0f5488-de3d-47f4-a5b2-4efe86710df6"). InnerVolumeSpecName "kube-api-access-jpgr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.763767 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.774696 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:01.832010 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpgr7\" (UniqueName: \"kubernetes.io/projected/bc0f5488-de3d-47f4-a5b2-4efe86710df6-kube-api-access-jpgr7\") on node \"crc\" DevicePath \"\"" Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:02.241534 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b663d9d02ded5a3b4e77e7818d41fa42f99ef10110224de48300f4e749b546de" Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:02.241626 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 04 01:03:02 crc kubenswrapper[4764]: I1204 01:03:02.564517 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0f5488-de3d-47f4-a5b2-4efe86710df6" path="/var/lib/kubelet/pods/bc0f5488-de3d-47f4-a5b2-4efe86710df6/volumes" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.059759 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 04 01:03:06 crc kubenswrapper[4764]: E1204 01:03:06.060998 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0f5488-de3d-47f4-a5b2-4efe86710df6" containerName="mariadb-client-4-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.061028 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0f5488-de3d-47f4-a5b2-4efe86710df6" containerName="mariadb-client-4-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.061368 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="bc0f5488-de3d-47f4-a5b2-4efe86710df6" containerName="mariadb-client-4-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.062502 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.065253 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jvh67" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.068857 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.207947 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8chx\" (UniqueName: \"kubernetes.io/projected/2435c6e0-c417-45ac-8c90-caae3bedaf4a-kube-api-access-c8chx\") pod \"mariadb-client-5-default\" (UID: \"2435c6e0-c417-45ac-8c90-caae3bedaf4a\") " pod="openstack/mariadb-client-5-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.309810 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8chx\" (UniqueName: \"kubernetes.io/projected/2435c6e0-c417-45ac-8c90-caae3bedaf4a-kube-api-access-c8chx\") pod \"mariadb-client-5-default\" (UID: \"2435c6e0-c417-45ac-8c90-caae3bedaf4a\") " pod="openstack/mariadb-client-5-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.355003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8chx\" (UniqueName: \"kubernetes.io/projected/2435c6e0-c417-45ac-8c90-caae3bedaf4a-kube-api-access-c8chx\") pod \"mariadb-client-5-default\" (UID: \"2435c6e0-c417-45ac-8c90-caae3bedaf4a\") " pod="openstack/mariadb-client-5-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.383737 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 04 01:03:06 crc kubenswrapper[4764]: I1204 01:03:06.742849 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 04 01:03:07 crc kubenswrapper[4764]: I1204 01:03:07.300691 4764 generic.go:334] "Generic (PLEG): container finished" podID="2435c6e0-c417-45ac-8c90-caae3bedaf4a" containerID="f0205ce0ff190f5c7c123e36859f71da3fd06e06a9f6d74071470f50ae17ed84" exitCode=0 Dec 04 01:03:07 crc kubenswrapper[4764]: I1204 01:03:07.301170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"2435c6e0-c417-45ac-8c90-caae3bedaf4a","Type":"ContainerDied","Data":"f0205ce0ff190f5c7c123e36859f71da3fd06e06a9f6d74071470f50ae17ed84"} Dec 04 01:03:07 crc kubenswrapper[4764]: I1204 01:03:07.301219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"2435c6e0-c417-45ac-8c90-caae3bedaf4a","Type":"ContainerStarted","Data":"fa8754eaf4e54e797f665cc5f3542626c40e75869c0274cdece99d3aa234321a"} Dec 04 01:03:08 crc kubenswrapper[4764]: I1204 01:03:08.808422 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 04 01:03:08 crc kubenswrapper[4764]: I1204 01:03:08.831270 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_2435c6e0-c417-45ac-8c90-caae3bedaf4a/mariadb-client-5-default/0.log" Dec 04 01:03:08 crc kubenswrapper[4764]: I1204 01:03:08.867521 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 04 01:03:08 crc kubenswrapper[4764]: I1204 01:03:08.876668 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 04 01:03:08 crc kubenswrapper[4764]: I1204 01:03:08.958500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8chx\" (UniqueName: \"kubernetes.io/projected/2435c6e0-c417-45ac-8c90-caae3bedaf4a-kube-api-access-c8chx\") pod \"2435c6e0-c417-45ac-8c90-caae3bedaf4a\" (UID: \"2435c6e0-c417-45ac-8c90-caae3bedaf4a\") " Dec 04 01:03:08 crc kubenswrapper[4764]: I1204 01:03:08.968870 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2435c6e0-c417-45ac-8c90-caae3bedaf4a-kube-api-access-c8chx" (OuterVolumeSpecName: "kube-api-access-c8chx") pod "2435c6e0-c417-45ac-8c90-caae3bedaf4a" (UID: "2435c6e0-c417-45ac-8c90-caae3bedaf4a"). InnerVolumeSpecName "kube-api-access-c8chx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.063350 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8chx\" (UniqueName: \"kubernetes.io/projected/2435c6e0-c417-45ac-8c90-caae3bedaf4a-kube-api-access-c8chx\") on node \"crc\" DevicePath \"\"" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.075821 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 04 01:03:09 crc kubenswrapper[4764]: E1204 01:03:09.076920 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2435c6e0-c417-45ac-8c90-caae3bedaf4a" containerName="mariadb-client-5-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.076951 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2435c6e0-c417-45ac-8c90-caae3bedaf4a" containerName="mariadb-client-5-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.077161 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2435c6e0-c417-45ac-8c90-caae3bedaf4a" containerName="mariadb-client-5-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.077884 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.088149 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.266458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwb7c\" (UniqueName: \"kubernetes.io/projected/8baa1ec9-5a31-4ba3-8ab2-a54060824344-kube-api-access-rwb7c\") pod \"mariadb-client-6-default\" (UID: \"8baa1ec9-5a31-4ba3-8ab2-a54060824344\") " pod="openstack/mariadb-client-6-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.323006 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa8754eaf4e54e797f665cc5f3542626c40e75869c0274cdece99d3aa234321a" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.323271 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.368652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwb7c\" (UniqueName: \"kubernetes.io/projected/8baa1ec9-5a31-4ba3-8ab2-a54060824344-kube-api-access-rwb7c\") pod \"mariadb-client-6-default\" (UID: \"8baa1ec9-5a31-4ba3-8ab2-a54060824344\") " pod="openstack/mariadb-client-6-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.399683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwb7c\" (UniqueName: \"kubernetes.io/projected/8baa1ec9-5a31-4ba3-8ab2-a54060824344-kube-api-access-rwb7c\") pod \"mariadb-client-6-default\" (UID: \"8baa1ec9-5a31-4ba3-8ab2-a54060824344\") " pod="openstack/mariadb-client-6-default" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.417830 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 04 01:03:09 crc kubenswrapper[4764]: E1204 01:03:09.544098 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2435c6e0_c417_45ac_8c90_caae3bedaf4a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2435c6e0_c417_45ac_8c90_caae3bedaf4a.slice/crio-fa8754eaf4e54e797f665cc5f3542626c40e75869c0274cdece99d3aa234321a\": RecentStats: unable to find data in memory cache]" Dec 04 01:03:09 crc kubenswrapper[4764]: I1204 01:03:09.776119 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 04 01:03:09 crc kubenswrapper[4764]: W1204 01:03:09.784702 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8baa1ec9_5a31_4ba3_8ab2_a54060824344.slice/crio-a03881837d5b0c8ffbe37b1f958a6da71aa6d3ff75db08882effb4792a3477c1 WatchSource:0}: Error finding container a03881837d5b0c8ffbe37b1f958a6da71aa6d3ff75db08882effb4792a3477c1: Status 404 returned error can't find the container with id a03881837d5b0c8ffbe37b1f958a6da71aa6d3ff75db08882effb4792a3477c1 Dec 04 01:03:10 crc kubenswrapper[4764]: I1204 01:03:10.257656 4764 scope.go:117] "RemoveContainer" containerID="66ae9d1d2cdeee9cbc403c41827d641c7dd579109c0c30a40637308370bfe30b" Dec 04 01:03:10 crc kubenswrapper[4764]: I1204 01:03:10.336889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8baa1ec9-5a31-4ba3-8ab2-a54060824344","Type":"ContainerStarted","Data":"f855c84a45f3eec1b4c42f662fe44f778e141827a5519202687e2577db5bb614"} Dec 04 01:03:10 crc kubenswrapper[4764]: I1204 01:03:10.336960 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" 
event={"ID":"8baa1ec9-5a31-4ba3-8ab2-a54060824344","Type":"ContainerStarted","Data":"a03881837d5b0c8ffbe37b1f958a6da71aa6d3ff75db08882effb4792a3477c1"} Dec 04 01:03:10 crc kubenswrapper[4764]: I1204 01:03:10.353852 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.353829621 podStartE2EDuration="1.353829621s" podCreationTimestamp="2025-12-04 01:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:03:10.349916455 +0000 UTC m=+4926.111240876" watchObservedRunningTime="2025-12-04 01:03:10.353829621 +0000 UTC m=+4926.115154032" Dec 04 01:03:10 crc kubenswrapper[4764]: I1204 01:03:10.562435 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2435c6e0-c417-45ac-8c90-caae3bedaf4a" path="/var/lib/kubelet/pods/2435c6e0-c417-45ac-8c90-caae3bedaf4a/volumes" Dec 04 01:03:11 crc kubenswrapper[4764]: I1204 01:03:11.351599 4764 generic.go:334] "Generic (PLEG): container finished" podID="8baa1ec9-5a31-4ba3-8ab2-a54060824344" containerID="f855c84a45f3eec1b4c42f662fe44f778e141827a5519202687e2577db5bb614" exitCode=1 Dec 04 01:03:11 crc kubenswrapper[4764]: I1204 01:03:11.351643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"8baa1ec9-5a31-4ba3-8ab2-a54060824344","Type":"ContainerDied","Data":"f855c84a45f3eec1b4c42f662fe44f778e141827a5519202687e2577db5bb614"} Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.757573 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.800179 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.805804 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.938896 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 04 01:03:12 crc kubenswrapper[4764]: E1204 01:03:12.939242 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baa1ec9-5a31-4ba3-8ab2-a54060824344" containerName="mariadb-client-6-default" Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.939263 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baa1ec9-5a31-4ba3-8ab2-a54060824344" containerName="mariadb-client-6-default" Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.939498 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baa1ec9-5a31-4ba3-8ab2-a54060824344" containerName="mariadb-client-6-default" Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.940191 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.940481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwb7c\" (UniqueName: \"kubernetes.io/projected/8baa1ec9-5a31-4ba3-8ab2-a54060824344-kube-api-access-rwb7c\") pod \"8baa1ec9-5a31-4ba3-8ab2-a54060824344\" (UID: \"8baa1ec9-5a31-4ba3-8ab2-a54060824344\") " Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.951302 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 04 01:03:12 crc kubenswrapper[4764]: I1204 01:03:12.981065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baa1ec9-5a31-4ba3-8ab2-a54060824344-kube-api-access-rwb7c" (OuterVolumeSpecName: "kube-api-access-rwb7c") pod "8baa1ec9-5a31-4ba3-8ab2-a54060824344" (UID: "8baa1ec9-5a31-4ba3-8ab2-a54060824344"). InnerVolumeSpecName "kube-api-access-rwb7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.042458 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2lnm\" (UniqueName: \"kubernetes.io/projected/34edebb5-74de-4906-93e5-21bebf1e3c59-kube-api-access-g2lnm\") pod \"mariadb-client-7-default\" (UID: \"34edebb5-74de-4906-93e5-21bebf1e3c59\") " pod="openstack/mariadb-client-7-default" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.042660 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwb7c\" (UniqueName: \"kubernetes.io/projected/8baa1ec9-5a31-4ba3-8ab2-a54060824344-kube-api-access-rwb7c\") on node \"crc\" DevicePath \"\"" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.144479 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2lnm\" (UniqueName: 
\"kubernetes.io/projected/34edebb5-74de-4906-93e5-21bebf1e3c59-kube-api-access-g2lnm\") pod \"mariadb-client-7-default\" (UID: \"34edebb5-74de-4906-93e5-21bebf1e3c59\") " pod="openstack/mariadb-client-7-default" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.174099 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2lnm\" (UniqueName: \"kubernetes.io/projected/34edebb5-74de-4906-93e5-21bebf1e3c59-kube-api-access-g2lnm\") pod \"mariadb-client-7-default\" (UID: \"34edebb5-74de-4906-93e5-21bebf1e3c59\") " pod="openstack/mariadb-client-7-default" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.319167 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.377426 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a03881837d5b0c8ffbe37b1f958a6da71aa6d3ff75db08882effb4792a3477c1" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.377484 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 04 01:03:13 crc kubenswrapper[4764]: I1204 01:03:13.753097 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 04 01:03:14 crc kubenswrapper[4764]: I1204 01:03:14.387412 4764 generic.go:334] "Generic (PLEG): container finished" podID="34edebb5-74de-4906-93e5-21bebf1e3c59" containerID="3e3b3228d7c0b06eae8fb8c71e5d0679baa49e0a4308994d466d9095bf93530d" exitCode=0 Dec 04 01:03:14 crc kubenswrapper[4764]: I1204 01:03:14.387625 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"34edebb5-74de-4906-93e5-21bebf1e3c59","Type":"ContainerDied","Data":"3e3b3228d7c0b06eae8fb8c71e5d0679baa49e0a4308994d466d9095bf93530d"} Dec 04 01:03:14 crc kubenswrapper[4764]: I1204 01:03:14.387891 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"34edebb5-74de-4906-93e5-21bebf1e3c59","Type":"ContainerStarted","Data":"900843dcd89163db4ff837701083d14a54f15542e5f6b675169cef7813b579b0"} Dec 04 01:03:14 crc kubenswrapper[4764]: I1204 01:03:14.561053 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8baa1ec9-5a31-4ba3-8ab2-a54060824344" path="/var/lib/kubelet/pods/8baa1ec9-5a31-4ba3-8ab2-a54060824344/volumes" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.348550 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.367081 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_34edebb5-74de-4906-93e5-21bebf1e3c59/mariadb-client-7-default/0.log" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.393869 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.403622 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.412403 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="900843dcd89163db4ff837701083d14a54f15542e5f6b675169cef7813b579b0" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.412484 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.495520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2lnm\" (UniqueName: \"kubernetes.io/projected/34edebb5-74de-4906-93e5-21bebf1e3c59-kube-api-access-g2lnm\") pod \"34edebb5-74de-4906-93e5-21bebf1e3c59\" (UID: \"34edebb5-74de-4906-93e5-21bebf1e3c59\") " Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.510140 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34edebb5-74de-4906-93e5-21bebf1e3c59-kube-api-access-g2lnm" (OuterVolumeSpecName: "kube-api-access-g2lnm") pod "34edebb5-74de-4906-93e5-21bebf1e3c59" (UID: "34edebb5-74de-4906-93e5-21bebf1e3c59"). InnerVolumeSpecName "kube-api-access-g2lnm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.534779 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 04 01:03:16 crc kubenswrapper[4764]: E1204 01:03:16.535170 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34edebb5-74de-4906-93e5-21bebf1e3c59" containerName="mariadb-client-7-default" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.535192 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="34edebb5-74de-4906-93e5-21bebf1e3c59" containerName="mariadb-client-7-default" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.535383 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="34edebb5-74de-4906-93e5-21bebf1e3c59" containerName="mariadb-client-7-default" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.536021 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.543645 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.577552 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34edebb5-74de-4906-93e5-21bebf1e3c59" path="/var/lib/kubelet/pods/34edebb5-74de-4906-93e5-21bebf1e3c59/volumes" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.597020 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2lnm\" (UniqueName: \"kubernetes.io/projected/34edebb5-74de-4906-93e5-21bebf1e3c59-kube-api-access-g2lnm\") on node \"crc\" DevicePath \"\"" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.698913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4hzs\" (UniqueName: \"kubernetes.io/projected/aa09a231-7a04-4851-87b4-0f7641094c88-kube-api-access-q4hzs\") pod 
\"mariadb-client-2\" (UID: \"aa09a231-7a04-4851-87b4-0f7641094c88\") " pod="openstack/mariadb-client-2" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.800778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4hzs\" (UniqueName: \"kubernetes.io/projected/aa09a231-7a04-4851-87b4-0f7641094c88-kube-api-access-q4hzs\") pod \"mariadb-client-2\" (UID: \"aa09a231-7a04-4851-87b4-0f7641094c88\") " pod="openstack/mariadb-client-2" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.830331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4hzs\" (UniqueName: \"kubernetes.io/projected/aa09a231-7a04-4851-87b4-0f7641094c88-kube-api-access-q4hzs\") pod \"mariadb-client-2\" (UID: \"aa09a231-7a04-4851-87b4-0f7641094c88\") " pod="openstack/mariadb-client-2" Dec 04 01:03:16 crc kubenswrapper[4764]: I1204 01:03:16.905425 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 04 01:03:17 crc kubenswrapper[4764]: I1204 01:03:17.584036 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 04 01:03:17 crc kubenswrapper[4764]: W1204 01:03:17.871836 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa09a231_7a04_4851_87b4_0f7641094c88.slice/crio-6a2448efea006c01dfa7bda8750346dd53ea7f4531e836b80bf5fa2d0ef9aff9 WatchSource:0}: Error finding container 6a2448efea006c01dfa7bda8750346dd53ea7f4531e836b80bf5fa2d0ef9aff9: Status 404 returned error can't find the container with id 6a2448efea006c01dfa7bda8750346dd53ea7f4531e836b80bf5fa2d0ef9aff9 Dec 04 01:03:18 crc kubenswrapper[4764]: I1204 01:03:18.436663 4764 generic.go:334] "Generic (PLEG): container finished" podID="aa09a231-7a04-4851-87b4-0f7641094c88" containerID="62a380c714e439dbe38ee9cef470b3ae135a7b7b820b683930771039d93cd660" exitCode=0 Dec 04 01:03:18 crc 
kubenswrapper[4764]: I1204 01:03:18.436853 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"aa09a231-7a04-4851-87b4-0f7641094c88","Type":"ContainerDied","Data":"62a380c714e439dbe38ee9cef470b3ae135a7b7b820b683930771039d93cd660"} Dec 04 01:03:18 crc kubenswrapper[4764]: I1204 01:03:18.437027 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"aa09a231-7a04-4851-87b4-0f7641094c88","Type":"ContainerStarted","Data":"6a2448efea006c01dfa7bda8750346dd53ea7f4531e836b80bf5fa2d0ef9aff9"} Dec 04 01:03:19 crc kubenswrapper[4764]: I1204 01:03:19.844709 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 04 01:03:19 crc kubenswrapper[4764]: I1204 01:03:19.865861 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_aa09a231-7a04-4851-87b4-0f7641094c88/mariadb-client-2/0.log" Dec 04 01:03:19 crc kubenswrapper[4764]: I1204 01:03:19.896045 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 04 01:03:19 crc kubenswrapper[4764]: I1204 01:03:19.903710 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 04 01:03:19 crc kubenswrapper[4764]: I1204 01:03:19.953619 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4hzs\" (UniqueName: \"kubernetes.io/projected/aa09a231-7a04-4851-87b4-0f7641094c88-kube-api-access-q4hzs\") pod \"aa09a231-7a04-4851-87b4-0f7641094c88\" (UID: \"aa09a231-7a04-4851-87b4-0f7641094c88\") " Dec 04 01:03:19 crc kubenswrapper[4764]: I1204 01:03:19.964107 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa09a231-7a04-4851-87b4-0f7641094c88-kube-api-access-q4hzs" (OuterVolumeSpecName: "kube-api-access-q4hzs") pod "aa09a231-7a04-4851-87b4-0f7641094c88" (UID: 
"aa09a231-7a04-4851-87b4-0f7641094c88"). InnerVolumeSpecName "kube-api-access-q4hzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:03:20 crc kubenswrapper[4764]: I1204 01:03:20.056030 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4hzs\" (UniqueName: \"kubernetes.io/projected/aa09a231-7a04-4851-87b4-0f7641094c88-kube-api-access-q4hzs\") on node \"crc\" DevicePath \"\"" Dec 04 01:03:20 crc kubenswrapper[4764]: I1204 01:03:20.480653 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2448efea006c01dfa7bda8750346dd53ea7f4531e836b80bf5fa2d0ef9aff9" Dec 04 01:03:20 crc kubenswrapper[4764]: I1204 01:03:20.480746 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 04 01:03:20 crc kubenswrapper[4764]: I1204 01:03:20.563034 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa09a231-7a04-4851-87b4-0f7641094c88" path="/var/lib/kubelet/pods/aa09a231-7a04-4851-87b4-0f7641094c88/volumes" Dec 04 01:04:50 crc kubenswrapper[4764]: I1204 01:04:50.868491 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:04:50 crc kubenswrapper[4764]: I1204 01:04:50.869081 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:05:20 crc kubenswrapper[4764]: I1204 01:05:20.869300 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:05:20 crc kubenswrapper[4764]: I1204 01:05:20.870285 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.669080 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r8gqj"] Dec 04 01:05:43 crc kubenswrapper[4764]: E1204 01:05:43.670145 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa09a231-7a04-4851-87b4-0f7641094c88" containerName="mariadb-client-2" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.670167 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa09a231-7a04-4851-87b4-0f7641094c88" containerName="mariadb-client-2" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.670422 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa09a231-7a04-4851-87b4-0f7641094c88" containerName="mariadb-client-2" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.672158 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.692857 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8gqj"] Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.764187 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcxg\" (UniqueName: \"kubernetes.io/projected/1519c57e-5ceb-4d05-b4bd-4d015f609372-kube-api-access-6lcxg\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.764265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-utilities\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.764309 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-catalog-content\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.865306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-utilities\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.865394 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-catalog-content\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.865927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcxg\" (UniqueName: \"kubernetes.io/projected/1519c57e-5ceb-4d05-b4bd-4d015f609372-kube-api-access-6lcxg\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.866340 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-utilities\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.866614 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-catalog-content\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:43 crc kubenswrapper[4764]: I1204 01:05:43.892980 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcxg\" (UniqueName: \"kubernetes.io/projected/1519c57e-5ceb-4d05-b4bd-4d015f609372-kube-api-access-6lcxg\") pod \"redhat-operators-r8gqj\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:44 crc kubenswrapper[4764]: I1204 01:05:44.008690 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:44 crc kubenswrapper[4764]: I1204 01:05:44.509157 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r8gqj"] Dec 04 01:05:45 crc kubenswrapper[4764]: I1204 01:05:45.106889 4764 generic.go:334] "Generic (PLEG): container finished" podID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerID="09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992" exitCode=0 Dec 04 01:05:45 crc kubenswrapper[4764]: I1204 01:05:45.106946 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerDied","Data":"09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992"} Dec 04 01:05:45 crc kubenswrapper[4764]: I1204 01:05:45.107189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerStarted","Data":"75f8547beb766696bd09d54bc72cb3f431e61c7a397052a33c814557481339b2"} Dec 04 01:05:46 crc kubenswrapper[4764]: I1204 01:05:46.119323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerStarted","Data":"3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4"} Dec 04 01:05:47 crc kubenswrapper[4764]: I1204 01:05:47.131396 4764 generic.go:334] "Generic (PLEG): container finished" podID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerID="3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4" exitCode=0 Dec 04 01:05:47 crc kubenswrapper[4764]: I1204 01:05:47.131456 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" 
event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerDied","Data":"3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4"} Dec 04 01:05:48 crc kubenswrapper[4764]: I1204 01:05:48.143365 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerStarted","Data":"934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4"} Dec 04 01:05:48 crc kubenswrapper[4764]: I1204 01:05:48.174198 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r8gqj" podStartSLOduration=2.415301799 podStartE2EDuration="5.17418239s" podCreationTimestamp="2025-12-04 01:05:43 +0000 UTC" firstStartedPulling="2025-12-04 01:05:45.108959248 +0000 UTC m=+5080.870283699" lastFinishedPulling="2025-12-04 01:05:47.867839849 +0000 UTC m=+5083.629164290" observedRunningTime="2025-12-04 01:05:48.167248489 +0000 UTC m=+5083.928572910" watchObservedRunningTime="2025-12-04 01:05:48.17418239 +0000 UTC m=+5083.935506811" Dec 04 01:05:50 crc kubenswrapper[4764]: I1204 01:05:50.869568 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:05:50 crc kubenswrapper[4764]: I1204 01:05:50.870177 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:05:50 crc kubenswrapper[4764]: I1204 01:05:50.870259 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:05:50 crc kubenswrapper[4764]: I1204 01:05:50.871409 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3e111b765473560b18e521875a6b90a1db7680da6cb7c351e1d070319575147"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:05:50 crc kubenswrapper[4764]: I1204 01:05:50.871538 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://b3e111b765473560b18e521875a6b90a1db7680da6cb7c351e1d070319575147" gracePeriod=600 Dec 04 01:05:51 crc kubenswrapper[4764]: I1204 01:05:51.174890 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="b3e111b765473560b18e521875a6b90a1db7680da6cb7c351e1d070319575147" exitCode=0 Dec 04 01:05:51 crc kubenswrapper[4764]: I1204 01:05:51.174957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"b3e111b765473560b18e521875a6b90a1db7680da6cb7c351e1d070319575147"} Dec 04 01:05:51 crc kubenswrapper[4764]: I1204 01:05:51.175010 4764 scope.go:117] "RemoveContainer" containerID="46a086f9fb2fe439d11d5381c08d7dd61be8a9d011c88534452dd09f8ff110c2" Dec 04 01:05:52 crc kubenswrapper[4764]: I1204 01:05:52.186892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d"} Dec 04 01:05:54 crc kubenswrapper[4764]: I1204 01:05:54.009563 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:54 crc kubenswrapper[4764]: I1204 01:05:54.010152 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:05:55 crc kubenswrapper[4764]: I1204 01:05:55.087999 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r8gqj" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="registry-server" probeResult="failure" output=< Dec 04 01:05:55 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 01:05:55 crc kubenswrapper[4764]: > Dec 04 01:06:04 crc kubenswrapper[4764]: I1204 01:06:04.079167 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:06:04 crc kubenswrapper[4764]: I1204 01:06:04.163119 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:06:04 crc kubenswrapper[4764]: I1204 01:06:04.334575 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8gqj"] Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.357973 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r8gqj" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="registry-server" containerID="cri-o://934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4" gracePeriod=2 Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.842265 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.876083 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-catalog-content\") pod \"1519c57e-5ceb-4d05-b4bd-4d015f609372\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.876216 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-utilities\") pod \"1519c57e-5ceb-4d05-b4bd-4d015f609372\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.876345 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lcxg\" (UniqueName: \"kubernetes.io/projected/1519c57e-5ceb-4d05-b4bd-4d015f609372-kube-api-access-6lcxg\") pod \"1519c57e-5ceb-4d05-b4bd-4d015f609372\" (UID: \"1519c57e-5ceb-4d05-b4bd-4d015f609372\") " Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.877188 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-utilities" (OuterVolumeSpecName: "utilities") pod "1519c57e-5ceb-4d05-b4bd-4d015f609372" (UID: "1519c57e-5ceb-4d05-b4bd-4d015f609372"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.883937 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1519c57e-5ceb-4d05-b4bd-4d015f609372-kube-api-access-6lcxg" (OuterVolumeSpecName: "kube-api-access-6lcxg") pod "1519c57e-5ceb-4d05-b4bd-4d015f609372" (UID: "1519c57e-5ceb-4d05-b4bd-4d015f609372"). InnerVolumeSpecName "kube-api-access-6lcxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.978553 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:06:05 crc kubenswrapper[4764]: I1204 01:06:05.978681 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lcxg\" (UniqueName: \"kubernetes.io/projected/1519c57e-5ceb-4d05-b4bd-4d015f609372-kube-api-access-6lcxg\") on node \"crc\" DevicePath \"\"" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.053627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1519c57e-5ceb-4d05-b4bd-4d015f609372" (UID: "1519c57e-5ceb-4d05-b4bd-4d015f609372"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.080414 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1519c57e-5ceb-4d05-b4bd-4d015f609372-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.373468 4764 generic.go:334] "Generic (PLEG): container finished" podID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerID="934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4" exitCode=0 Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.373559 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r8gqj" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.373552 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerDied","Data":"934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4"} Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.374152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r8gqj" event={"ID":"1519c57e-5ceb-4d05-b4bd-4d015f609372","Type":"ContainerDied","Data":"75f8547beb766696bd09d54bc72cb3f431e61c7a397052a33c814557481339b2"} Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.374197 4764 scope.go:117] "RemoveContainer" containerID="934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.410071 4764 scope.go:117] "RemoveContainer" containerID="3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.442970 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r8gqj"] Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.455667 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r8gqj"] Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.459381 4764 scope.go:117] "RemoveContainer" containerID="09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.491899 4764 scope.go:117] "RemoveContainer" containerID="934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4" Dec 04 01:06:06 crc kubenswrapper[4764]: E1204 01:06:06.492511 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4\": container with ID starting with 934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4 not found: ID does not exist" containerID="934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.492570 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4"} err="failed to get container status \"934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4\": rpc error: code = NotFound desc = could not find container \"934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4\": container with ID starting with 934b89ce6efa2e50fab164443adc0e40fcfe51fbfd51bb510e208a26b799dce4 not found: ID does not exist" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.492609 4764 scope.go:117] "RemoveContainer" containerID="3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4" Dec 04 01:06:06 crc kubenswrapper[4764]: E1204 01:06:06.493232 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4\": container with ID starting with 3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4 not found: ID does not exist" containerID="3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.493272 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4"} err="failed to get container status \"3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4\": rpc error: code = NotFound desc = could not find container \"3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4\": container with ID 
starting with 3aea7f6a162ca964d7102155907ce61aead74de2a719d1487dd029ab4c6f63b4 not found: ID does not exist" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.493300 4764 scope.go:117] "RemoveContainer" containerID="09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992" Dec 04 01:06:06 crc kubenswrapper[4764]: E1204 01:06:06.493903 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992\": container with ID starting with 09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992 not found: ID does not exist" containerID="09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.493990 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992"} err="failed to get container status \"09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992\": rpc error: code = NotFound desc = could not find container \"09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992\": container with ID starting with 09b151f1682aa1f0249509b9164483a57fde607ef2f51f800f81b91317138992 not found: ID does not exist" Dec 04 01:06:06 crc kubenswrapper[4764]: I1204 01:06:06.563906 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" path="/var/lib/kubelet/pods/1519c57e-5ceb-4d05-b4bd-4d015f609372/volumes" Dec 04 01:06:24 crc kubenswrapper[4764]: I1204 01:06:24.967785 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xdhrp"] Dec 04 01:06:24 crc kubenswrapper[4764]: E1204 01:06:24.969747 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="extract-content" Dec 04 01:06:24 crc 
kubenswrapper[4764]: I1204 01:06:24.969909 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="extract-content" Dec 04 01:06:24 crc kubenswrapper[4764]: E1204 01:06:24.969999 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="extract-utilities" Dec 04 01:06:24 crc kubenswrapper[4764]: I1204 01:06:24.970079 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="extract-utilities" Dec 04 01:06:24 crc kubenswrapper[4764]: E1204 01:06:24.970178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="registry-server" Dec 04 01:06:24 crc kubenswrapper[4764]: I1204 01:06:24.970251 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="registry-server" Dec 04 01:06:24 crc kubenswrapper[4764]: I1204 01:06:24.970513 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1519c57e-5ceb-4d05-b4bd-4d015f609372" containerName="registry-server" Dec 04 01:06:24 crc kubenswrapper[4764]: I1204 01:06:24.972605 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:24 crc kubenswrapper[4764]: I1204 01:06:24.981623 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdhrp"] Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.088963 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q8z8\" (UniqueName: \"kubernetes.io/projected/27de57e8-59c9-4a18-bfa4-94e738909a04-kube-api-access-8q8z8\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.089124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-utilities\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.089160 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-catalog-content\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.190501 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q8z8\" (UniqueName: \"kubernetes.io/projected/27de57e8-59c9-4a18-bfa4-94e738909a04-kube-api-access-8q8z8\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.190635 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-utilities\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.190677 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-catalog-content\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.191160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-utilities\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.191286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-catalog-content\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.215927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q8z8\" (UniqueName: \"kubernetes.io/projected/27de57e8-59c9-4a18-bfa4-94e738909a04-kube-api-access-8q8z8\") pod \"redhat-marketplace-xdhrp\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.309944 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:25 crc kubenswrapper[4764]: I1204 01:06:25.792599 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdhrp"] Dec 04 01:06:26 crc kubenswrapper[4764]: I1204 01:06:26.575811 4764 generic.go:334] "Generic (PLEG): container finished" podID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerID="fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff" exitCode=0 Dec 04 01:06:26 crc kubenswrapper[4764]: I1204 01:06:26.575929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdhrp" event={"ID":"27de57e8-59c9-4a18-bfa4-94e738909a04","Type":"ContainerDied","Data":"fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff"} Dec 04 01:06:26 crc kubenswrapper[4764]: I1204 01:06:26.576604 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdhrp" event={"ID":"27de57e8-59c9-4a18-bfa4-94e738909a04","Type":"ContainerStarted","Data":"2421e8aa8e8c2240ea4b34f1fddb671dd04b798bc5aa617e87541c191d1a6e32"} Dec 04 01:06:27 crc kubenswrapper[4764]: I1204 01:06:27.586534 4764 generic.go:334] "Generic (PLEG): container finished" podID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerID="7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0" exitCode=0 Dec 04 01:06:27 crc kubenswrapper[4764]: I1204 01:06:27.586876 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdhrp" event={"ID":"27de57e8-59c9-4a18-bfa4-94e738909a04","Type":"ContainerDied","Data":"7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0"} Dec 04 01:06:28 crc kubenswrapper[4764]: I1204 01:06:28.596225 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdhrp" 
event={"ID":"27de57e8-59c9-4a18-bfa4-94e738909a04","Type":"ContainerStarted","Data":"12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd"} Dec 04 01:06:28 crc kubenswrapper[4764]: I1204 01:06:28.625139 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xdhrp" podStartSLOduration=3.223546709 podStartE2EDuration="4.625120115s" podCreationTimestamp="2025-12-04 01:06:24 +0000 UTC" firstStartedPulling="2025-12-04 01:06:26.580702383 +0000 UTC m=+5122.342026804" lastFinishedPulling="2025-12-04 01:06:27.982275789 +0000 UTC m=+5123.743600210" observedRunningTime="2025-12-04 01:06:28.618523242 +0000 UTC m=+5124.379847683" watchObservedRunningTime="2025-12-04 01:06:28.625120115 +0000 UTC m=+5124.386444556" Dec 04 01:06:35 crc kubenswrapper[4764]: I1204 01:06:35.311108 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:35 crc kubenswrapper[4764]: I1204 01:06:35.311860 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:35 crc kubenswrapper[4764]: I1204 01:06:35.393886 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:35 crc kubenswrapper[4764]: I1204 01:06:35.700604 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:35 crc kubenswrapper[4764]: I1204 01:06:35.744492 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdhrp"] Dec 04 01:06:37 crc kubenswrapper[4764]: I1204 01:06:37.696187 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xdhrp" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="registry-server" 
containerID="cri-o://12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd" gracePeriod=2 Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.143455 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.211326 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-catalog-content\") pod \"27de57e8-59c9-4a18-bfa4-94e738909a04\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.211482 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q8z8\" (UniqueName: \"kubernetes.io/projected/27de57e8-59c9-4a18-bfa4-94e738909a04-kube-api-access-8q8z8\") pod \"27de57e8-59c9-4a18-bfa4-94e738909a04\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.211670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-utilities\") pod \"27de57e8-59c9-4a18-bfa4-94e738909a04\" (UID: \"27de57e8-59c9-4a18-bfa4-94e738909a04\") " Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.213128 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-utilities" (OuterVolumeSpecName: "utilities") pod "27de57e8-59c9-4a18-bfa4-94e738909a04" (UID: "27de57e8-59c9-4a18-bfa4-94e738909a04"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.220809 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27de57e8-59c9-4a18-bfa4-94e738909a04-kube-api-access-8q8z8" (OuterVolumeSpecName: "kube-api-access-8q8z8") pod "27de57e8-59c9-4a18-bfa4-94e738909a04" (UID: "27de57e8-59c9-4a18-bfa4-94e738909a04"). InnerVolumeSpecName "kube-api-access-8q8z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.246220 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27de57e8-59c9-4a18-bfa4-94e738909a04" (UID: "27de57e8-59c9-4a18-bfa4-94e738909a04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.314482 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.314536 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27de57e8-59c9-4a18-bfa4-94e738909a04-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.314562 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q8z8\" (UniqueName: \"kubernetes.io/projected/27de57e8-59c9-4a18-bfa4-94e738909a04-kube-api-access-8q8z8\") on node \"crc\" DevicePath \"\"" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.713709 4764 generic.go:334] "Generic (PLEG): container finished" podID="27de57e8-59c9-4a18-bfa4-94e738909a04" 
containerID="12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd" exitCode=0 Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.713809 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xdhrp" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.713846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdhrp" event={"ID":"27de57e8-59c9-4a18-bfa4-94e738909a04","Type":"ContainerDied","Data":"12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd"} Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.714300 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xdhrp" event={"ID":"27de57e8-59c9-4a18-bfa4-94e738909a04","Type":"ContainerDied","Data":"2421e8aa8e8c2240ea4b34f1fddb671dd04b798bc5aa617e87541c191d1a6e32"} Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.714338 4764 scope.go:117] "RemoveContainer" containerID="12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.762038 4764 scope.go:117] "RemoveContainer" containerID="7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.771616 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdhrp"] Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.783800 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xdhrp"] Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.792844 4764 scope.go:117] "RemoveContainer" containerID="fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.845668 4764 scope.go:117] "RemoveContainer" containerID="12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd" Dec 04 
01:06:38 crc kubenswrapper[4764]: E1204 01:06:38.846304 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd\": container with ID starting with 12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd not found: ID does not exist" containerID="12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.846400 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd"} err="failed to get container status \"12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd\": rpc error: code = NotFound desc = could not find container \"12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd\": container with ID starting with 12ff08a65acd8b4ab19c500a24b6024a4b6453038a9fab458d57b14ce24b8abd not found: ID does not exist" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.846458 4764 scope.go:117] "RemoveContainer" containerID="7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0" Dec 04 01:06:38 crc kubenswrapper[4764]: E1204 01:06:38.847089 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0\": container with ID starting with 7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0 not found: ID does not exist" containerID="7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.847141 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0"} err="failed to get container status 
\"7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0\": rpc error: code = NotFound desc = could not find container \"7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0\": container with ID starting with 7bf46a94d44e212f180d88f32f47604b7d5512a3f24965413842e428a036d7a0 not found: ID does not exist" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.847172 4764 scope.go:117] "RemoveContainer" containerID="fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff" Dec 04 01:06:38 crc kubenswrapper[4764]: E1204 01:06:38.847557 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff\": container with ID starting with fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff not found: ID does not exist" containerID="fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff" Dec 04 01:06:38 crc kubenswrapper[4764]: I1204 01:06:38.847613 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff"} err="failed to get container status \"fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff\": rpc error: code = NotFound desc = could not find container \"fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff\": container with ID starting with fe8c1148ad025d3d2072826697036ca909f7769accdee07d2122064e50024aff not found: ID does not exist" Dec 04 01:06:40 crc kubenswrapper[4764]: I1204 01:06:40.563438 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" path="/var/lib/kubelet/pods/27de57e8-59c9-4a18-bfa4-94e738909a04/volumes" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.077001 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6p4sj"] Dec 04 01:06:56 
crc kubenswrapper[4764]: E1204 01:06:56.078003 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="registry-server" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.078020 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="registry-server" Dec 04 01:06:56 crc kubenswrapper[4764]: E1204 01:06:56.078041 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="extract-content" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.078050 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="extract-content" Dec 04 01:06:56 crc kubenswrapper[4764]: E1204 01:06:56.078069 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="extract-utilities" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.078079 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="extract-utilities" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.078267 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27de57e8-59c9-4a18-bfa4-94e738909a04" containerName="registry-server" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.079932 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.107219 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6p4sj"] Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.235684 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7czv9\" (UniqueName: \"kubernetes.io/projected/f301965a-6425-43a6-85ce-7a4fdbc6a88f-kube-api-access-7czv9\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.235783 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-utilities\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.235985 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-catalog-content\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.337170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-utilities\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.337431 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-catalog-content\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.337618 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7czv9\" (UniqueName: \"kubernetes.io/projected/f301965a-6425-43a6-85ce-7a4fdbc6a88f-kube-api-access-7czv9\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.337778 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-utilities\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.337905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-catalog-content\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.368427 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7czv9\" (UniqueName: \"kubernetes.io/projected/f301965a-6425-43a6-85ce-7a4fdbc6a88f-kube-api-access-7czv9\") pod \"community-operators-6p4sj\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.408151 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:06:56 crc kubenswrapper[4764]: I1204 01:06:56.957427 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6p4sj"] Dec 04 01:06:56 crc kubenswrapper[4764]: W1204 01:06:56.978093 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf301965a_6425_43a6_85ce_7a4fdbc6a88f.slice/crio-079c5f79e5063038850a78e7bc9eaa7ac8dc10a106ebcf1bdec3cf819597acb2 WatchSource:0}: Error finding container 079c5f79e5063038850a78e7bc9eaa7ac8dc10a106ebcf1bdec3cf819597acb2: Status 404 returned error can't find the container with id 079c5f79e5063038850a78e7bc9eaa7ac8dc10a106ebcf1bdec3cf819597acb2 Dec 04 01:06:57 crc kubenswrapper[4764]: I1204 01:06:57.920153 4764 generic.go:334] "Generic (PLEG): container finished" podID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerID="6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384" exitCode=0 Dec 04 01:06:57 crc kubenswrapper[4764]: I1204 01:06:57.920205 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p4sj" event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerDied","Data":"6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384"} Dec 04 01:06:57 crc kubenswrapper[4764]: I1204 01:06:57.920244 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p4sj" event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerStarted","Data":"079c5f79e5063038850a78e7bc9eaa7ac8dc10a106ebcf1bdec3cf819597acb2"} Dec 04 01:06:58 crc kubenswrapper[4764]: I1204 01:06:58.932380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p4sj" 
event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerStarted","Data":"43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff"} Dec 04 01:06:59 crc kubenswrapper[4764]: I1204 01:06:59.944814 4764 generic.go:334] "Generic (PLEG): container finished" podID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerID="43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff" exitCode=0 Dec 04 01:06:59 crc kubenswrapper[4764]: I1204 01:06:59.944911 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p4sj" event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerDied","Data":"43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff"} Dec 04 01:07:00 crc kubenswrapper[4764]: I1204 01:07:00.955054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p4sj" event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerStarted","Data":"3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf"} Dec 04 01:07:00 crc kubenswrapper[4764]: I1204 01:07:00.971136 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6p4sj" podStartSLOduration=2.529264794 podStartE2EDuration="4.971115701s" podCreationTimestamp="2025-12-04 01:06:56 +0000 UTC" firstStartedPulling="2025-12-04 01:06:57.923162764 +0000 UTC m=+5153.684487215" lastFinishedPulling="2025-12-04 01:07:00.365013701 +0000 UTC m=+5156.126338122" observedRunningTime="2025-12-04 01:07:00.969992883 +0000 UTC m=+5156.731317304" watchObservedRunningTime="2025-12-04 01:07:00.971115701 +0000 UTC m=+5156.732440112" Dec 04 01:07:06 crc kubenswrapper[4764]: I1204 01:07:06.408566 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:07:06 crc kubenswrapper[4764]: I1204 01:07:06.409009 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:07:06 crc kubenswrapper[4764]: I1204 01:07:06.475523 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:07:07 crc kubenswrapper[4764]: I1204 01:07:07.063680 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:07:07 crc kubenswrapper[4764]: I1204 01:07:07.127492 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6p4sj"] Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.029837 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6p4sj" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="registry-server" containerID="cri-o://3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf" gracePeriod=2 Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.505407 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.666577 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7czv9\" (UniqueName: \"kubernetes.io/projected/f301965a-6425-43a6-85ce-7a4fdbc6a88f-kube-api-access-7czv9\") pod \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.666683 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-catalog-content\") pod \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.666744 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-utilities\") pod \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\" (UID: \"f301965a-6425-43a6-85ce-7a4fdbc6a88f\") " Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.667840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-utilities" (OuterVolumeSpecName: "utilities") pod "f301965a-6425-43a6-85ce-7a4fdbc6a88f" (UID: "f301965a-6425-43a6-85ce-7a4fdbc6a88f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.680329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f301965a-6425-43a6-85ce-7a4fdbc6a88f-kube-api-access-7czv9" (OuterVolumeSpecName: "kube-api-access-7czv9") pod "f301965a-6425-43a6-85ce-7a4fdbc6a88f" (UID: "f301965a-6425-43a6-85ce-7a4fdbc6a88f"). InnerVolumeSpecName "kube-api-access-7czv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.719828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f301965a-6425-43a6-85ce-7a4fdbc6a88f" (UID: "f301965a-6425-43a6-85ce-7a4fdbc6a88f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.768531 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7czv9\" (UniqueName: \"kubernetes.io/projected/f301965a-6425-43a6-85ce-7a4fdbc6a88f-kube-api-access-7czv9\") on node \"crc\" DevicePath \"\"" Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.768580 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:07:09 crc kubenswrapper[4764]: I1204 01:07:09.768593 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f301965a-6425-43a6-85ce-7a4fdbc6a88f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.054016 4764 generic.go:334] "Generic (PLEG): container finished" podID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerID="3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf" exitCode=0 Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.054077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6p4sj" event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerDied","Data":"3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf"} Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.054152 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6p4sj" event={"ID":"f301965a-6425-43a6-85ce-7a4fdbc6a88f","Type":"ContainerDied","Data":"079c5f79e5063038850a78e7bc9eaa7ac8dc10a106ebcf1bdec3cf819597acb2"} Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.054182 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6p4sj" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.054253 4764 scope.go:117] "RemoveContainer" containerID="3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.080013 4764 scope.go:117] "RemoveContainer" containerID="43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.095577 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6p4sj"] Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.109222 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6p4sj"] Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.120615 4764 scope.go:117] "RemoveContainer" containerID="6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.144369 4764 scope.go:117] "RemoveContainer" containerID="3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf" Dec 04 01:07:10 crc kubenswrapper[4764]: E1204 01:07:10.144916 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf\": container with ID starting with 3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf not found: ID does not exist" containerID="3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 
01:07:10.144965 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf"} err="failed to get container status \"3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf\": rpc error: code = NotFound desc = could not find container \"3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf\": container with ID starting with 3165ca25b380e6a2a72db91249b3926933706a938f4702a1df3ba08bdaececdf not found: ID does not exist" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.144997 4764 scope.go:117] "RemoveContainer" containerID="43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff" Dec 04 01:07:10 crc kubenswrapper[4764]: E1204 01:07:10.145327 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff\": container with ID starting with 43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff not found: ID does not exist" containerID="43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.145365 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff"} err="failed to get container status \"43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff\": rpc error: code = NotFound desc = could not find container \"43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff\": container with ID starting with 43b72867af8e3f7f55ff14407fa3a9ad934a5583095307641cbfd8fba6a702ff not found: ID does not exist" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.145396 4764 scope.go:117] "RemoveContainer" containerID="6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384" Dec 04 01:07:10 crc 
kubenswrapper[4764]: E1204 01:07:10.145645 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384\": container with ID starting with 6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384 not found: ID does not exist" containerID="6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.145676 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384"} err="failed to get container status \"6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384\": rpc error: code = NotFound desc = could not find container \"6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384\": container with ID starting with 6acd53679f9a2366842a47672482936b57eb7aad4d91bf867c685e193f8b0384 not found: ID does not exist" Dec 04 01:07:10 crc kubenswrapper[4764]: I1204 01:07:10.560169 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" path="/var/lib/kubelet/pods/f301965a-6425-43a6-85ce-7a4fdbc6a88f/volumes" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.022850 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 04 01:07:26 crc kubenswrapper[4764]: E1204 01:07:26.024397 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="extract-content" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.024431 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="extract-content" Dec 04 01:07:26 crc kubenswrapper[4764]: E1204 01:07:26.024459 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="registry-server" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.024475 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="registry-server" Dec 04 01:07:26 crc kubenswrapper[4764]: E1204 01:07:26.024543 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="extract-utilities" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.024565 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="extract-utilities" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.024949 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f301965a-6425-43a6-85ce-7a4fdbc6a88f" containerName="registry-server" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.026105 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.030599 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-jvh67" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.048677 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.154622 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.155232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7fdv\" (UniqueName: 
\"kubernetes.io/projected/e67613d0-4acc-4685-be57-a7a4f9d44a08-kube-api-access-w7fdv\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.256769 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.257003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7fdv\" (UniqueName: \"kubernetes.io/projected/e67613d0-4acc-4685-be57-a7a4f9d44a08-kube-api-access-w7fdv\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.260980 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.261051 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9cfd79929a7db36ba0a18a707b7c7f215198c9ce9fa0f0d344b7d7a15b4bf5fd/globalmount\"" pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.293666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7fdv\" (UniqueName: \"kubernetes.io/projected/e67613d0-4acc-4685-be57-a7a4f9d44a08-kube-api-access-w7fdv\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.313252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") pod \"mariadb-copy-data\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.362489 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 04 01:07:26 crc kubenswrapper[4764]: I1204 01:07:26.958824 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 04 01:07:27 crc kubenswrapper[4764]: I1204 01:07:27.222000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e67613d0-4acc-4685-be57-a7a4f9d44a08","Type":"ContainerStarted","Data":"0df504eadc1d838dee7c9c7ea4f5bed08b7ad14bb3b1e3052b82054f986a0a8b"} Dec 04 01:07:27 crc kubenswrapper[4764]: I1204 01:07:27.222321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e67613d0-4acc-4685-be57-a7a4f9d44a08","Type":"ContainerStarted","Data":"d6ed7f4aa744de1c973ac117a7f929840cf5b00260cb1554f8fb31e4be8ef8bd"} Dec 04 01:07:27 crc kubenswrapper[4764]: I1204 01:07:27.243670 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.243645056 podStartE2EDuration="3.243645056s" podCreationTimestamp="2025-12-04 01:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:07:27.242201591 +0000 UTC m=+5183.003526022" watchObservedRunningTime="2025-12-04 01:07:27.243645056 +0000 UTC m=+5183.004969487" Dec 04 01:07:29 crc kubenswrapper[4764]: I1204 01:07:29.959679 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:29 crc kubenswrapper[4764]: I1204 01:07:29.961439 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:29 crc kubenswrapper[4764]: I1204 01:07:29.990652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:30 crc kubenswrapper[4764]: I1204 01:07:30.129868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj428\" (UniqueName: \"kubernetes.io/projected/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99-kube-api-access-zj428\") pod \"mariadb-client\" (UID: \"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99\") " pod="openstack/mariadb-client" Dec 04 01:07:30 crc kubenswrapper[4764]: I1204 01:07:30.234784 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj428\" (UniqueName: \"kubernetes.io/projected/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99-kube-api-access-zj428\") pod \"mariadb-client\" (UID: \"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99\") " pod="openstack/mariadb-client" Dec 04 01:07:30 crc kubenswrapper[4764]: I1204 01:07:30.259772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj428\" (UniqueName: \"kubernetes.io/projected/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99-kube-api-access-zj428\") pod \"mariadb-client\" (UID: \"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99\") " pod="openstack/mariadb-client" Dec 04 01:07:30 crc kubenswrapper[4764]: I1204 01:07:30.291062 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:30 crc kubenswrapper[4764]: W1204 01:07:30.560166 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e3d4d6_fe4c_4a8a_b27e_e4dcf7f6dc99.slice/crio-19f7482e769b8c6f7a1398c80e84aa6b692b53109bb524b9006eac0c52a76b95 WatchSource:0}: Error finding container 19f7482e769b8c6f7a1398c80e84aa6b692b53109bb524b9006eac0c52a76b95: Status 404 returned error can't find the container with id 19f7482e769b8c6f7a1398c80e84aa6b692b53109bb524b9006eac0c52a76b95 Dec 04 01:07:30 crc kubenswrapper[4764]: I1204 01:07:30.565745 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:31 crc kubenswrapper[4764]: I1204 01:07:31.274328 4764 generic.go:334] "Generic (PLEG): container finished" podID="b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" containerID="b03d1bc808bf3cfeb33c2dcb642e3fb25e640c57fef08dec687040dd1b7dee16" exitCode=0 Dec 04 01:07:31 crc kubenswrapper[4764]: I1204 01:07:31.274406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99","Type":"ContainerDied","Data":"b03d1bc808bf3cfeb33c2dcb642e3fb25e640c57fef08dec687040dd1b7dee16"} Dec 04 01:07:31 crc kubenswrapper[4764]: I1204 01:07:31.276368 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99","Type":"ContainerStarted","Data":"19f7482e769b8c6f7a1398c80e84aa6b692b53109bb524b9006eac0c52a76b95"} Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.685515 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.710245 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99/mariadb-client/0.log" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.733870 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.751824 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.779557 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj428\" (UniqueName: \"kubernetes.io/projected/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99-kube-api-access-zj428\") pod \"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99\" (UID: \"b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99\") " Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.788549 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99-kube-api-access-zj428" (OuterVolumeSpecName: "kube-api-access-zj428") pod "b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" (UID: "b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99"). InnerVolumeSpecName "kube-api-access-zj428". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.864297 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:32 crc kubenswrapper[4764]: E1204 01:07:32.864660 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" containerName="mariadb-client" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.864682 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" containerName="mariadb-client" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.865029 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" containerName="mariadb-client" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.866412 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.873430 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.882575 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj428\" (UniqueName: \"kubernetes.io/projected/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99-kube-api-access-zj428\") on node \"crc\" DevicePath \"\"" Dec 04 01:07:32 crc kubenswrapper[4764]: I1204 01:07:32.983867 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmfk\" (UniqueName: \"kubernetes.io/projected/b23fae89-5e23-47e0-abec-c53561976d33-kube-api-access-7zmfk\") pod \"mariadb-client\" (UID: \"b23fae89-5e23-47e0-abec-c53561976d33\") " pod="openstack/mariadb-client" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.086149 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmfk\" (UniqueName: 
\"kubernetes.io/projected/b23fae89-5e23-47e0-abec-c53561976d33-kube-api-access-7zmfk\") pod \"mariadb-client\" (UID: \"b23fae89-5e23-47e0-abec-c53561976d33\") " pod="openstack/mariadb-client" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.108928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmfk\" (UniqueName: \"kubernetes.io/projected/b23fae89-5e23-47e0-abec-c53561976d33-kube-api-access-7zmfk\") pod \"mariadb-client\" (UID: \"b23fae89-5e23-47e0-abec-c53561976d33\") " pod="openstack/mariadb-client" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.186629 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.318506 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f7482e769b8c6f7a1398c80e84aa6b692b53109bb524b9006eac0c52a76b95" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.318566 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.342687 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" podUID="b23fae89-5e23-47e0-abec-c53561976d33" Dec 04 01:07:33 crc kubenswrapper[4764]: I1204 01:07:33.470407 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:33 crc kubenswrapper[4764]: W1204 01:07:33.479008 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb23fae89_5e23_47e0_abec_c53561976d33.slice/crio-6e7093f1ccb0064b88d2a8a3edb48cf7465e3931aff2e0dd53378143ae173d49 WatchSource:0}: Error finding container 6e7093f1ccb0064b88d2a8a3edb48cf7465e3931aff2e0dd53378143ae173d49: Status 404 returned error can't find the container with id 6e7093f1ccb0064b88d2a8a3edb48cf7465e3931aff2e0dd53378143ae173d49 Dec 04 01:07:34 crc kubenswrapper[4764]: I1204 01:07:34.333604 4764 generic.go:334] "Generic (PLEG): container finished" podID="b23fae89-5e23-47e0-abec-c53561976d33" containerID="9bb5a845ff8875f101d463d2ec1eb9d7451f22c737c763b643421531dfba0ddd" exitCode=0 Dec 04 01:07:34 crc kubenswrapper[4764]: I1204 01:07:34.333802 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b23fae89-5e23-47e0-abec-c53561976d33","Type":"ContainerDied","Data":"9bb5a845ff8875f101d463d2ec1eb9d7451f22c737c763b643421531dfba0ddd"} Dec 04 01:07:34 crc kubenswrapper[4764]: I1204 01:07:34.334038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b23fae89-5e23-47e0-abec-c53561976d33","Type":"ContainerStarted","Data":"6e7093f1ccb0064b88d2a8a3edb48cf7465e3931aff2e0dd53378143ae173d49"} Dec 04 01:07:34 crc kubenswrapper[4764]: I1204 01:07:34.562313 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99" path="/var/lib/kubelet/pods/b5e3d4d6-fe4c-4a8a-b27e-e4dcf7f6dc99/volumes" Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.671215 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.687668 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_b23fae89-5e23-47e0-abec-c53561976d33/mariadb-client/0.log" Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.721815 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.731518 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.828923 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zmfk\" (UniqueName: \"kubernetes.io/projected/b23fae89-5e23-47e0-abec-c53561976d33-kube-api-access-7zmfk\") pod \"b23fae89-5e23-47e0-abec-c53561976d33\" (UID: \"b23fae89-5e23-47e0-abec-c53561976d33\") " Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.834043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b23fae89-5e23-47e0-abec-c53561976d33-kube-api-access-7zmfk" (OuterVolumeSpecName: "kube-api-access-7zmfk") pod "b23fae89-5e23-47e0-abec-c53561976d33" (UID: "b23fae89-5e23-47e0-abec-c53561976d33"). InnerVolumeSpecName "kube-api-access-7zmfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:07:35 crc kubenswrapper[4764]: I1204 01:07:35.930921 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zmfk\" (UniqueName: \"kubernetes.io/projected/b23fae89-5e23-47e0-abec-c53561976d33-kube-api-access-7zmfk\") on node \"crc\" DevicePath \"\"" Dec 04 01:07:36 crc kubenswrapper[4764]: I1204 01:07:36.351959 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e7093f1ccb0064b88d2a8a3edb48cf7465e3931aff2e0dd53378143ae173d49" Dec 04 01:07:36 crc kubenswrapper[4764]: I1204 01:07:36.352029 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 04 01:07:36 crc kubenswrapper[4764]: I1204 01:07:36.572084 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b23fae89-5e23-47e0-abec-c53561976d33" path="/var/lib/kubelet/pods/b23fae89-5e23-47e0-abec-c53561976d33/volumes" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.735523 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 01:08:06 crc kubenswrapper[4764]: E1204 01:08:06.738171 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b23fae89-5e23-47e0-abec-c53561976d33" containerName="mariadb-client" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.738333 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b23fae89-5e23-47e0-abec-c53561976d33" containerName="mariadb-client" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.738843 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b23fae89-5e23-47e0-abec-c53561976d33" containerName="mariadb-client" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.740479 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.743476 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-vbmjb" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.743522 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.758912 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.768235 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.769691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.779364 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.783237 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.787805 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.802791 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.818837 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95fbeb7-559b-43f0-9c44-6462f6db2e42-config\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.818918 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f95fbeb7-559b-43f0-9c44-6462f6db2e42-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.818970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.819074 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95fbeb7-559b-43f0-9c44-6462f6db2e42-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.819120 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f95fbeb7-559b-43f0-9c44-6462f6db2e42-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.819189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5j8\" (UniqueName: \"kubernetes.io/projected/f95fbeb7-559b-43f0-9c44-6462f6db2e42-kube-api-access-np5j8\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.828873 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mfv4\" (UniqueName: \"kubernetes.io/projected/7f86b426-5476-4395-a2f4-7b9ba3dead52-kube-api-access-2mfv4\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920375 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkzlx\" (UniqueName: \"kubernetes.io/projected/211073e0-a5c0-49b8-a4e2-524448c0f91c-kube-api-access-gkzlx\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95fbeb7-559b-43f0-9c44-6462f6db2e42-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " 
pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920478 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f95fbeb7-559b-43f0-9c44-6462f6db2e42-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920496 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211073e0-a5c0-49b8-a4e2-524448c0f91c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np5j8\" (UniqueName: \"kubernetes.io/projected/f95fbeb7-559b-43f0-9c44-6462f6db2e42-kube-api-access-np5j8\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920537 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f86b426-5476-4395-a2f4-7b9ba3dead52-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc 
kubenswrapper[4764]: I1204 01:08:06.920576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f86b426-5476-4395-a2f4-7b9ba3dead52-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211073e0-a5c0-49b8-a4e2-524448c0f91c-config\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920633 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/211073e0-a5c0-49b8-a4e2-524448c0f91c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920647 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f86b426-5476-4395-a2f4-7b9ba3dead52-config\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920662 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95fbeb7-559b-43f0-9c44-6462f6db2e42-config\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920677 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211073e0-a5c0-49b8-a4e2-524448c0f91c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f95fbeb7-559b-43f0-9c44-6462f6db2e42-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920753 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f86b426-5476-4395-a2f4-7b9ba3dead52-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.920772 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.922906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f95fbeb7-559b-43f0-9c44-6462f6db2e42-config\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.922906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f95fbeb7-559b-43f0-9c44-6462f6db2e42-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.923290 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.923727 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f95fbeb7-559b-43f0-9c44-6462f6db2e42-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.934266 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.934840 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.934878 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/158c3ed4aafc6f8ce3bb254b676d7fbcfaa95202d925d4a5fbe2535e55961d99/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.935880 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-nrrmw" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.936424 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f95fbeb7-559b-43f0-9c44-6462f6db2e42-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.936628 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.937923 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.948511 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.951286 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np5j8\" (UniqueName: \"kubernetes.io/projected/f95fbeb7-559b-43f0-9c44-6462f6db2e42-kube-api-access-np5j8\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 
01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.961861 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.963617 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.969601 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.971287 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.981749 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.987970 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e0a5b8d7-beb5-451e-a0f9-c39a0b38c71a\") pod \"ovsdbserver-nb-0\" (UID: \"f95fbeb7-559b-43f0-9c44-6462f6db2e42\") " pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:06 crc kubenswrapper[4764]: I1204 01:08:06.997562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfgw6\" (UniqueName: \"kubernetes.io/projected/b37142e0-8801-4951-ab52-0a6a553b23ff-kube-api-access-zfgw6\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021568 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021597 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211073e0-a5c0-49b8-a4e2-524448c0f91c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021624 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f86b426-5476-4395-a2f4-7b9ba3dead52-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021704 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37142e0-8801-4951-ab52-0a6a553b23ff-config\") pod \"ovsdbserver-sb-2\" (UID: 
\"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37142e0-8801-4951-ab52-0a6a553b23ff-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021782 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021804 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37142e0-8801-4951-ab52-0a6a553b23ff-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021830 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021865 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\") pod \"ovsdbserver-nb-1\" (UID: 
\"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021892 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssqlt\" (UniqueName: \"kubernetes.io/projected/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-kube-api-access-ssqlt\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021913 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021943 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f86b426-5476-4395-a2f4-7b9ba3dead52-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.021985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211073e0-a5c0-49b8-a4e2-524448c0f91c-config\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 
01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022012 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022038 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/211073e0-a5c0-49b8-a4e2-524448c0f91c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f86b426-5476-4395-a2f4-7b9ba3dead52-config\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022079 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211073e0-a5c0-49b8-a4e2-524448c0f91c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022109 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-config\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7f86b426-5476-4395-a2f4-7b9ba3dead52-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022156 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b37142e0-8801-4951-ab52-0a6a553b23ff-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfqz\" (UniqueName: \"kubernetes.io/projected/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-kube-api-access-ddfqz\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mfv4\" (UniqueName: \"kubernetes.io/projected/7f86b426-5476-4395-a2f4-7b9ba3dead52-kube-api-access-2mfv4\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022205 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022226 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkzlx\" (UniqueName: \"kubernetes.io/projected/211073e0-a5c0-49b8-a4e2-524448c0f91c-kube-api-access-gkzlx\") 
pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022244 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022271 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.022871 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/211073e0-a5c0-49b8-a4e2-524448c0f91c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.023598 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/211073e0-a5c0-49b8-a4e2-524448c0f91c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.024410 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/211073e0-a5c0-49b8-a4e2-524448c0f91c-config\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 
01:08:07.024545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7f86b426-5476-4395-a2f4-7b9ba3dead52-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.025933 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.025966 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9f61cc6a754824f239a48b3f66fb39710347ba330e65613aca78f3d69349fc2a/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.026211 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.026243 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c2c63908e1ead6f2eccbcdcfdfdb26ce4c4f6fbe3809ba3128ad3762ef56c613/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.026619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f86b426-5476-4395-a2f4-7b9ba3dead52-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.028382 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/211073e0-a5c0-49b8-a4e2-524448c0f91c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.029091 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f86b426-5476-4395-a2f4-7b9ba3dead52-config\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.031458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f86b426-5476-4395-a2f4-7b9ba3dead52-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 
01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.040595 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkzlx\" (UniqueName: \"kubernetes.io/projected/211073e0-a5c0-49b8-a4e2-524448c0f91c-kube-api-access-gkzlx\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.043167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mfv4\" (UniqueName: \"kubernetes.io/projected/7f86b426-5476-4395-a2f4-7b9ba3dead52-kube-api-access-2mfv4\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.049388 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-58015e13-cc5a-4fa0-a729-1d9903f7a6dc\") pod \"ovsdbserver-nb-1\" (UID: \"7f86b426-5476-4395-a2f4-7b9ba3dead52\") " pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.051548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ca52c7d8-3e34-4c39-ac44-c391a9881d47\") pod \"ovsdbserver-nb-2\" (UID: \"211073e0-a5c0-49b8-a4e2-524448c0f91c\") " pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.066098 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.096291 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.109553 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.124835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfqz\" (UniqueName: \"kubernetes.io/projected/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-kube-api-access-ddfqz\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.124876 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b37142e0-8801-4951-ab52-0a6a553b23ff-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.124900 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.124926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.124958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.124986 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfgw6\" (UniqueName: \"kubernetes.io/projected/b37142e0-8801-4951-ab52-0a6a553b23ff-kube-api-access-zfgw6\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125003 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37142e0-8801-4951-ab52-0a6a553b23ff-config\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37142e0-8801-4951-ab52-0a6a553b23ff-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125125 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125143 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37142e0-8801-4951-ab52-0a6a553b23ff-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125190 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssqlt\" (UniqueName: \"kubernetes.io/projected/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-kube-api-access-ssqlt\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125208 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125230 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " 
pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125260 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.125307 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-config\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.126036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b37142e0-8801-4951-ab52-0a6a553b23ff-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.126827 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.126905 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-config\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.127135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.127248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.127520 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-config\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.127887 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.129047 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b37142e0-8801-4951-ab52-0a6a553b23ff-config\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.134202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.134327 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b37142e0-8801-4951-ab52-0a6a553b23ff-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.138665 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b37142e0-8801-4951-ab52-0a6a553b23ff-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.141569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.141963 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.141997 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6649d4375412ba3e9b6f118f84c273ce20bf6569be5745ef52b97bed1bebb1c5/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.142330 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.142366 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d32ff87c30ecb0fec9c267471f97db2132d9539fd132929daa69262d3856e64b/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.146024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssqlt\" (UniqueName: \"kubernetes.io/projected/0189e999-3d6d-43b6-9c0c-2afa24e5b6c0-kube-api-access-ssqlt\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.146851 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfqz\" (UniqueName: \"kubernetes.io/projected/1a9830a9-8b4b-443e-bca0-4b70d281dfb9-kube-api-access-ddfqz\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.147082 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.147116 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a7ffbf239c7bffc8dadd4433271d511a01cccab84766ab52e239dff04e4e3fe6/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.150141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfgw6\" (UniqueName: \"kubernetes.io/projected/b37142e0-8801-4951-ab52-0a6a553b23ff-kube-api-access-zfgw6\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.194098 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e62fa4a3-06c0-4049-9c61-717e6a20e9ba\") pod \"ovsdbserver-sb-2\" (UID: \"b37142e0-8801-4951-ab52-0a6a553b23ff\") " pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.202877 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-935f8f37-f1bd-4464-b0d0-f6e3a2881127\") pod \"ovsdbserver-sb-1\" (UID: \"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0\") " pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.206799 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72ead28e-084d-43e2-963e-0c5c23c6c043\") pod \"ovsdbserver-sb-0\" (UID: \"1a9830a9-8b4b-443e-bca0-4b70d281dfb9\") " pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.310799 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.320334 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.383243 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.722973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.957335 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f95fbeb7-559b-43f0-9c44-6462f6db2e42","Type":"ContainerStarted","Data":"c26b7cd8fd725c78988ffcce0ea6c9d24cc2e551ca65abafddb9c48a1fe47760"} Dec 04 01:08:07 crc kubenswrapper[4764]: I1204 01:08:07.957835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f95fbeb7-559b-43f0-9c44-6462f6db2e42","Type":"ContainerStarted","Data":"0b2b142cae5fad7f8f5ad58b2029aa000bf2c3cf518430269cd88df38c342dd0"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.006907 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 04 01:08:08 crc kubenswrapper[4764]: W1204 01:08:08.011483 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb37142e0_8801_4951_ab52_0a6a553b23ff.slice/crio-f5d86ef62f16ffc492f9a88bddba69338af40893062691996c737d4ffd1d13c2 WatchSource:0}: Error finding container 
f5d86ef62f16ffc492f9a88bddba69338af40893062691996c737d4ffd1d13c2: Status 404 returned error can't find the container with id f5d86ef62f16ffc492f9a88bddba69338af40893062691996c737d4ffd1d13c2 Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.079466 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 04 01:08:08 crc kubenswrapper[4764]: W1204 01:08:08.325348 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod211073e0_a5c0_49b8_a4e2_524448c0f91c.slice/crio-31ae8fe2b9c6cdd4d74d995601d334387126f388c410cee40f1da8454361247f WatchSource:0}: Error finding container 31ae8fe2b9c6cdd4d74d995601d334387126f388c410cee40f1da8454361247f: Status 404 returned error can't find the container with id 31ae8fe2b9c6cdd4d74d995601d334387126f388c410cee40f1da8454361247f Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.327132 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.845967 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 04 01:08:08 crc kubenswrapper[4764]: W1204 01:08:08.849382 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f86b426_5476_4395_a2f4_7b9ba3dead52.slice/crio-c9e5f60b01729692f36bfb0f367c2410bf29b4c8c1a0dd548fc47730390bade5 WatchSource:0}: Error finding container c9e5f60b01729692f36bfb0f367c2410bf29b4c8c1a0dd548fc47730390bade5: Status 404 returned error can't find the container with id c9e5f60b01729692f36bfb0f367c2410bf29b4c8c1a0dd548fc47730390bade5 Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.973266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"f95fbeb7-559b-43f0-9c44-6462f6db2e42","Type":"ContainerStarted","Data":"e4044cf4d35b071ecfec2cbcf44f4dbb7cb97e67669d046e56e8264c4eab7934"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.980818 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b37142e0-8801-4951-ab52-0a6a553b23ff","Type":"ContainerStarted","Data":"b67a65c9ce03d3c40889e8bac8f9b8c8edacc992193f172d5e7c1178307a398c"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.980890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b37142e0-8801-4951-ab52-0a6a553b23ff","Type":"ContainerStarted","Data":"7b8830f14364491888bf3b8138f625e88932083929b5284de8746c43b4c67f67"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.980918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"b37142e0-8801-4951-ab52-0a6a553b23ff","Type":"ContainerStarted","Data":"f5d86ef62f16ffc492f9a88bddba69338af40893062691996c737d4ffd1d13c2"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.984254 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"211073e0-a5c0-49b8-a4e2-524448c0f91c","Type":"ContainerStarted","Data":"83aefba455606dc6d73759bfb463afc85322ef8ed29e39da153b6cb4572380fc"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.984306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"211073e0-a5c0-49b8-a4e2-524448c0f91c","Type":"ContainerStarted","Data":"7764a8eb8d0aa2137c086ddcfca725098d97174c0f258f2bcd0589e364293ef2"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.984325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"211073e0-a5c0-49b8-a4e2-524448c0f91c","Type":"ContainerStarted","Data":"31ae8fe2b9c6cdd4d74d995601d334387126f388c410cee40f1da8454361247f"} Dec 04 01:08:08 crc 
kubenswrapper[4764]: I1204 01:08:08.987795 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0","Type":"ContainerStarted","Data":"eee1cfca56bcbd1f4533d6f872678a622d7b10f3e20f43b80111c107ca56232a"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.987855 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0","Type":"ContainerStarted","Data":"727468a22741338d88c14f99bde0fa59b5ef55162ef7c4942b50e5ed7a769f81"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.987880 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"0189e999-3d6d-43b6-9c0c-2afa24e5b6c0","Type":"ContainerStarted","Data":"41128f2fffd75ca8ab56f436dacd95f59e010067a4acec054016346dabd4ecee"} Dec 04 01:08:08 crc kubenswrapper[4764]: I1204 01:08:08.990422 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7f86b426-5476-4395-a2f4-7b9ba3dead52","Type":"ContainerStarted","Data":"c9e5f60b01729692f36bfb0f367c2410bf29b4c8c1a0dd548fc47730390bade5"} Dec 04 01:08:09 crc kubenswrapper[4764]: I1204 01:08:09.012103 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.012075892 podStartE2EDuration="4.012075892s" podCreationTimestamp="2025-12-04 01:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:08.997000351 +0000 UTC m=+5224.758324802" watchObservedRunningTime="2025-12-04 01:08:09.012075892 +0000 UTC m=+5224.773400343" Dec 04 01:08:09 crc kubenswrapper[4764]: I1204 01:08:09.028415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.028394475 podStartE2EDuration="4.028394475s" 
podCreationTimestamp="2025-12-04 01:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:09.026710983 +0000 UTC m=+5224.788035484" watchObservedRunningTime="2025-12-04 01:08:09.028394475 +0000 UTC m=+5224.789718886" Dec 04 01:08:09 crc kubenswrapper[4764]: I1204 01:08:09.049968 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=4.049938066 podStartE2EDuration="4.049938066s" podCreationTimestamp="2025-12-04 01:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:09.0431966 +0000 UTC m=+5224.804521051" watchObservedRunningTime="2025-12-04 01:08:09.049938066 +0000 UTC m=+5224.811262517" Dec 04 01:08:09 crc kubenswrapper[4764]: I1204 01:08:09.068176 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.068162755 podStartE2EDuration="4.068162755s" podCreationTimestamp="2025-12-04 01:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:09.064032523 +0000 UTC m=+5224.825356934" watchObservedRunningTime="2025-12-04 01:08:09.068162755 +0000 UTC m=+5224.829487166" Dec 04 01:08:09 crc kubenswrapper[4764]: I1204 01:08:09.170781 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.004615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7f86b426-5476-4395-a2f4-7b9ba3dead52","Type":"ContainerStarted","Data":"5e66d4df436504c6a73d22692e124c8b47731c1c3abfb589319d0f9937689d54"} Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.004675 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"7f86b426-5476-4395-a2f4-7b9ba3dead52","Type":"ContainerStarted","Data":"1adb8d052a449a8cbeb2e4c351dffcb284d49b1daf5a304a3c0c19282f1ee53e"} Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.008065 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1a9830a9-8b4b-443e-bca0-4b70d281dfb9","Type":"ContainerStarted","Data":"90619d197f025c3dc17d6bb37f45336653deb7def6896ff9b29e4f1636073b5d"} Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.008136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1a9830a9-8b4b-443e-bca0-4b70d281dfb9","Type":"ContainerStarted","Data":"e7569e4f88a6bf9ea1819953736bf2e9ca2fa7a50c4a69e24a1c7c9db6ec4b84"} Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.008157 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1a9830a9-8b4b-443e-bca0-4b70d281dfb9","Type":"ContainerStarted","Data":"d72e7d132f8ba489149f044472889cbbbd743ce2f1754228efb7dc04ba5a7010"} Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.048210 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=5.048191431 podStartE2EDuration="5.048191431s" podCreationTimestamp="2025-12-04 01:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:10.036296358 +0000 UTC m=+5225.797620809" watchObservedRunningTime="2025-12-04 01:08:10.048191431 +0000 UTC m=+5225.809515852" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.068004 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.074913 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.074883429 podStartE2EDuration="5.074883429s" podCreationTimestamp="2025-12-04 01:08:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:10.061786577 +0000 UTC m=+5225.823111088" watchObservedRunningTime="2025-12-04 01:08:10.074883429 +0000 UTC m=+5225.836207880" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.096428 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.109941 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.145367 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.312098 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.320837 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:10 crc kubenswrapper[4764]: I1204 01:08:10.383809 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:11 crc kubenswrapper[4764]: I1204 01:08:11.028303 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.096622 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.108467 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 
01:08:12.110010 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.311031 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.320812 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.383602 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.395382 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f7fbb8cf-q56kx"] Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.397072 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.398807 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.418227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f7fbb8cf-q56kx"] Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.554432 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxh47\" (UniqueName: \"kubernetes.io/projected/25c23bd4-6f49-4036-9450-5d0313b4babb-kube-api-access-lxh47\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.554477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-config\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" 
(UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.554769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-dns-svc\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.554926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-ovsdbserver-nb\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.656252 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxh47\" (UniqueName: \"kubernetes.io/projected/25c23bd4-6f49-4036-9450-5d0313b4babb-kube-api-access-lxh47\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.656655 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-config\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.657658 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-config\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " 
pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.658652 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-dns-svc\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.658893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-ovsdbserver-nb\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.659509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-dns-svc\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.659650 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-ovsdbserver-nb\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 01:08:12.684650 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxh47\" (UniqueName: \"kubernetes.io/projected/25c23bd4-6f49-4036-9450-5d0313b4babb-kube-api-access-lxh47\") pod \"dnsmasq-dns-5f7fbb8cf-q56kx\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:12 crc kubenswrapper[4764]: I1204 
01:08:12.716750 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.169020 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.180667 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:13 crc kubenswrapper[4764]: W1204 01:08:13.200710 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25c23bd4_6f49_4036_9450_5d0313b4babb.slice/crio-b91398bcf9f8e74b5cd2008645f88e636e5d1816c0c177152ac32a27ee7d6364 WatchSource:0}: Error finding container b91398bcf9f8e74b5cd2008645f88e636e5d1816c0c177152ac32a27ee7d6364: Status 404 returned error can't find the container with id b91398bcf9f8e74b5cd2008645f88e636e5d1816c0c177152ac32a27ee7d6364 Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.213235 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f7fbb8cf-q56kx"] Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.245258 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.376883 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.392357 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.426016 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.452019 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-2" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.465930 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.818544 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f7fbb8cf-q56kx"] Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.852948 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64b69c8799-g4kzf"] Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.854521 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.857997 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 01:08:13 crc kubenswrapper[4764]: I1204 01:08:13.865522 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b69c8799-g4kzf"] Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.032221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-config\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.032286 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-dns-svc\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.032344 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-sb\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.032422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pfjc\" (UniqueName: \"kubernetes.io/projected/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-kube-api-access-9pfjc\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.032446 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-nb\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.053954 4764 generic.go:334] "Generic (PLEG): container finished" podID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerID="b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14" exitCode=0 Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.054040 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" event={"ID":"25c23bd4-6f49-4036-9450-5d0313b4babb","Type":"ContainerDied","Data":"b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14"} Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.054073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" event={"ID":"25c23bd4-6f49-4036-9450-5d0313b4babb","Type":"ContainerStarted","Data":"b91398bcf9f8e74b5cd2008645f88e636e5d1816c0c177152ac32a27ee7d6364"} Dec 04 01:08:14 crc kubenswrapper[4764]: 
I1204 01:08:14.125539 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.133471 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-sb\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.133582 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pfjc\" (UniqueName: \"kubernetes.io/projected/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-kube-api-access-9pfjc\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.133624 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-nb\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.133660 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-config\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.133702 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-dns-svc\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: 
\"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.134674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-nb\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.134704 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-config\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.134789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-sb\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.135095 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-dns-svc\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.136386 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.154592 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pfjc\" (UniqueName: 
\"kubernetes.io/projected/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-kube-api-access-9pfjc\") pod \"dnsmasq-dns-64b69c8799-g4kzf\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.183284 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:14 crc kubenswrapper[4764]: I1204 01:08:14.652327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b69c8799-g4kzf"] Dec 04 01:08:14 crc kubenswrapper[4764]: W1204 01:08:14.652971 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cbf8936_8e2d_43b1_8fe2_dd85b68354c7.slice/crio-0100dd2be4167d3cc7d6a2bdd9fead3fb302edb1cadca005f3ddfd9842bb5e82 WatchSource:0}: Error finding container 0100dd2be4167d3cc7d6a2bdd9fead3fb302edb1cadca005f3ddfd9842bb5e82: Status 404 returned error can't find the container with id 0100dd2be4167d3cc7d6a2bdd9fead3fb302edb1cadca005f3ddfd9842bb5e82 Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.064443 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" event={"ID":"25c23bd4-6f49-4036-9450-5d0313b4babb","Type":"ContainerStarted","Data":"b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e"} Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.064620 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerName="dnsmasq-dns" containerID="cri-o://b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e" gracePeriod=10 Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.064966 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:15 crc kubenswrapper[4764]: 
I1204 01:08:15.070368 4764 generic.go:334] "Generic (PLEG): container finished" podID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerID="c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4" exitCode=0 Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.072922 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" event={"ID":"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7","Type":"ContainerDied","Data":"c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4"} Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.073543 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" event={"ID":"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7","Type":"ContainerStarted","Data":"0100dd2be4167d3cc7d6a2bdd9fead3fb302edb1cadca005f3ddfd9842bb5e82"} Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.089137 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" podStartSLOduration=3.089112902 podStartE2EDuration="3.089112902s" podCreationTimestamp="2025-12-04 01:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:15.08863696 +0000 UTC m=+5230.849961391" watchObservedRunningTime="2025-12-04 01:08:15.089112902 +0000 UTC m=+5230.850437323" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.523898 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.561882 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-dns-svc\") pod \"25c23bd4-6f49-4036-9450-5d0313b4babb\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.562015 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-ovsdbserver-nb\") pod \"25c23bd4-6f49-4036-9450-5d0313b4babb\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.562100 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxh47\" (UniqueName: \"kubernetes.io/projected/25c23bd4-6f49-4036-9450-5d0313b4babb-kube-api-access-lxh47\") pod \"25c23bd4-6f49-4036-9450-5d0313b4babb\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.562124 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-config\") pod \"25c23bd4-6f49-4036-9450-5d0313b4babb\" (UID: \"25c23bd4-6f49-4036-9450-5d0313b4babb\") " Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.570806 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25c23bd4-6f49-4036-9450-5d0313b4babb-kube-api-access-lxh47" (OuterVolumeSpecName: "kube-api-access-lxh47") pod "25c23bd4-6f49-4036-9450-5d0313b4babb" (UID: "25c23bd4-6f49-4036-9450-5d0313b4babb"). InnerVolumeSpecName "kube-api-access-lxh47". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.611261 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-config" (OuterVolumeSpecName: "config") pod "25c23bd4-6f49-4036-9450-5d0313b4babb" (UID: "25c23bd4-6f49-4036-9450-5d0313b4babb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.616369 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25c23bd4-6f49-4036-9450-5d0313b4babb" (UID: "25c23bd4-6f49-4036-9450-5d0313b4babb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.629843 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25c23bd4-6f49-4036-9450-5d0313b4babb" (UID: "25c23bd4-6f49-4036-9450-5d0313b4babb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.668581 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.668629 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.668654 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxh47\" (UniqueName: \"kubernetes.io/projected/25c23bd4-6f49-4036-9450-5d0313b4babb-kube-api-access-lxh47\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:15 crc kubenswrapper[4764]: I1204 01:08:15.668671 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25c23bd4-6f49-4036-9450-5d0313b4babb-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.100681 4764 generic.go:334] "Generic (PLEG): container finished" podID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerID="b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e" exitCode=0 Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.100853 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.100884 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" event={"ID":"25c23bd4-6f49-4036-9450-5d0313b4babb","Type":"ContainerDied","Data":"b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e"} Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.101238 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f7fbb8cf-q56kx" event={"ID":"25c23bd4-6f49-4036-9450-5d0313b4babb","Type":"ContainerDied","Data":"b91398bcf9f8e74b5cd2008645f88e636e5d1816c0c177152ac32a27ee7d6364"} Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.101268 4764 scope.go:117] "RemoveContainer" containerID="b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.108378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" event={"ID":"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7","Type":"ContainerStarted","Data":"0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd"} Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.108640 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.150345 4764 scope.go:117] "RemoveContainer" containerID="b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.164192 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" podStartSLOduration=3.163668469 podStartE2EDuration="3.163668469s" podCreationTimestamp="2025-12-04 01:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:16.143371438 
+0000 UTC m=+5231.904695889" watchObservedRunningTime="2025-12-04 01:08:16.163668469 +0000 UTC m=+5231.924992890" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.182031 4764 scope.go:117] "RemoveContainer" containerID="b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e" Dec 04 01:08:16 crc kubenswrapper[4764]: E1204 01:08:16.183260 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e\": container with ID starting with b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e not found: ID does not exist" containerID="b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.183297 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e"} err="failed to get container status \"b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e\": rpc error: code = NotFound desc = could not find container \"b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e\": container with ID starting with b7f9b51e7638e9fe2fbab2f5c46fe21fe5517f0bfcd75af8083e0111390b956e not found: ID does not exist" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.183372 4764 scope.go:117] "RemoveContainer" containerID="b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14" Dec 04 01:08:16 crc kubenswrapper[4764]: E1204 01:08:16.186813 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14\": container with ID starting with b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14 not found: ID does not exist" containerID="b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14" Dec 
04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.186876 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14"} err="failed to get container status \"b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14\": rpc error: code = NotFound desc = could not find container \"b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14\": container with ID starting with b60e6ba661d363d80ee0b9c2c25cf289198f3ef3b42aa90fa3e5f60bfe4dba14 not found: ID does not exist" Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.209155 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f7fbb8cf-q56kx"] Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.221742 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f7fbb8cf-q56kx"] Dec 04 01:08:16 crc kubenswrapper[4764]: I1204 01:08:16.561432 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" path="/var/lib/kubelet/pods/25c23bd4-6f49-4036-9450-5d0313b4babb/volumes" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.215540 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 04 01:08:17 crc kubenswrapper[4764]: E1204 01:08:17.216132 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerName="init" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.216266 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerName="init" Dec 04 01:08:17 crc kubenswrapper[4764]: E1204 01:08:17.216401 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerName="dnsmasq-dns" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.216503 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerName="dnsmasq-dns" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.216945 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c23bd4-6f49-4036-9450-5d0313b4babb" containerName="dnsmasq-dns" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.217831 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.222939 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.225607 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.291485 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.291577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a70e1201-59bc-45ce-b002-848176bc24cf-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.291660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txvlf\" (UniqueName: \"kubernetes.io/projected/a70e1201-59bc-45ce-b002-848176bc24cf-kube-api-access-txvlf\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 
01:08:17.393371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.393500 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a70e1201-59bc-45ce-b002-848176bc24cf-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.393787 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txvlf\" (UniqueName: \"kubernetes.io/projected/a70e1201-59bc-45ce-b002-848176bc24cf-kube-api-access-txvlf\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.397267 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.397302 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5b41eb331f1b9fdd09b4c1fcf3007f09d8d8d09f1d313a0aa6120167fb4fb226/globalmount\"" pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.403386 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a70e1201-59bc-45ce-b002-848176bc24cf-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.422627 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txvlf\" (UniqueName: \"kubernetes.io/projected/a70e1201-59bc-45ce-b002-848176bc24cf-kube-api-access-txvlf\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.440239 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") pod \"ovn-copy-data\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " pod="openstack/ovn-copy-data" Dec 04 01:08:17 crc kubenswrapper[4764]: I1204 01:08:17.547684 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 04 01:08:18 crc kubenswrapper[4764]: I1204 01:08:18.155096 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 04 01:08:18 crc kubenswrapper[4764]: W1204 01:08:18.160089 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda70e1201_59bc_45ce_b002_848176bc24cf.slice/crio-447ae8b144c85ab806fc1aabf8d2c43b9901653e7c20b93ed7fe6798ba32be1a WatchSource:0}: Error finding container 447ae8b144c85ab806fc1aabf8d2c43b9901653e7c20b93ed7fe6798ba32be1a: Status 404 returned error can't find the container with id 447ae8b144c85ab806fc1aabf8d2c43b9901653e7c20b93ed7fe6798ba32be1a Dec 04 01:08:18 crc kubenswrapper[4764]: I1204 01:08:18.164560 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:08:19 crc kubenswrapper[4764]: I1204 01:08:19.145433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a70e1201-59bc-45ce-b002-848176bc24cf","Type":"ContainerStarted","Data":"a094c7a69f206ccbfa8fef0e881fa8c00abdc49bb06aebbdef92bbdb3d484e3d"} Dec 04 01:08:19 crc kubenswrapper[4764]: I1204 01:08:19.146036 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a70e1201-59bc-45ce-b002-848176bc24cf","Type":"ContainerStarted","Data":"447ae8b144c85ab806fc1aabf8d2c43b9901653e7c20b93ed7fe6798ba32be1a"} Dec 04 01:08:19 crc kubenswrapper[4764]: I1204 01:08:19.176205 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.6979900150000002 podStartE2EDuration="3.176184032s" podCreationTimestamp="2025-12-04 01:08:16 +0000 UTC" firstStartedPulling="2025-12-04 01:08:18.163966462 +0000 UTC m=+5233.925290903" lastFinishedPulling="2025-12-04 01:08:18.642160469 +0000 UTC m=+5234.403484920" observedRunningTime="2025-12-04 
01:08:19.167537299 +0000 UTC m=+5234.928861720" watchObservedRunningTime="2025-12-04 01:08:19.176184032 +0000 UTC m=+5234.937508453" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.659405 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6wkhn"] Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.664009 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.685703 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wkhn"] Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.860869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqq2d\" (UniqueName: \"kubernetes.io/projected/bcb42619-9693-474e-a889-5ea1c2b6decc-kube-api-access-sqq2d\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.860931 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-catalog-content\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.861004 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-utilities\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.868762 4764 
patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.868838 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.962257 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqq2d\" (UniqueName: \"kubernetes.io/projected/bcb42619-9693-474e-a889-5ea1c2b6decc-kube-api-access-sqq2d\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.962335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-catalog-content\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.962394 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-utilities\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.963150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-utilities\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.963273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-catalog-content\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.987935 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqq2d\" (UniqueName: \"kubernetes.io/projected/bcb42619-9693-474e-a889-5ea1c2b6decc-kube-api-access-sqq2d\") pod \"certified-operators-6wkhn\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:20 crc kubenswrapper[4764]: I1204 01:08:20.989944 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:21 crc kubenswrapper[4764]: I1204 01:08:21.576025 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6wkhn"] Dec 04 01:08:21 crc kubenswrapper[4764]: W1204 01:08:21.583884 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcb42619_9693_474e_a889_5ea1c2b6decc.slice/crio-542a2a3b193c1f7861f4058ce08e57a169fd357debc253951547bc791387820a WatchSource:0}: Error finding container 542a2a3b193c1f7861f4058ce08e57a169fd357debc253951547bc791387820a: Status 404 returned error can't find the container with id 542a2a3b193c1f7861f4058ce08e57a169fd357debc253951547bc791387820a Dec 04 01:08:22 crc kubenswrapper[4764]: I1204 01:08:22.180596 4764 generic.go:334] "Generic (PLEG): container finished" podID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerID="5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052" exitCode=0 Dec 04 01:08:22 crc kubenswrapper[4764]: I1204 01:08:22.180683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerDied","Data":"5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052"} Dec 04 01:08:22 crc kubenswrapper[4764]: I1204 01:08:22.180923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerStarted","Data":"542a2a3b193c1f7861f4058ce08e57a169fd357debc253951547bc791387820a"} Dec 04 01:08:23 crc kubenswrapper[4764]: I1204 01:08:23.192418 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" 
event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerStarted","Data":"3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025"} Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.185378 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.207794 4764 generic.go:334] "Generic (PLEG): container finished" podID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerID="3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025" exitCode=0 Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.207865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerDied","Data":"3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025"} Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.306917 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-gwrqw"] Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.307191 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerName="dnsmasq-dns" containerID="cri-o://fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31" gracePeriod=10 Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.723366 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.727586 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.730660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4xn67" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.730964 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.731531 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.762794 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.824750 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.858476 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22754f83-6ba0-48f6-82f7-5a28b5c5498c-scripts\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.858991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jcs9\" (UniqueName: \"kubernetes.io/projected/22754f83-6ba0-48f6-82f7-5a28b5c5498c-kube-api-access-9jcs9\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.859022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22754f83-6ba0-48f6-82f7-5a28b5c5498c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " 
pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.859042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22754f83-6ba0-48f6-82f7-5a28b5c5498c-config\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.859097 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22754f83-6ba0-48f6-82f7-5a28b5c5498c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.959989 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-dns-svc\") pod \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960064 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmld8\" (UniqueName: \"kubernetes.io/projected/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-kube-api-access-zmld8\") pod \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960092 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-config\") pod \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\" (UID: \"ad65f8fe-26a5-4700-8a13-a535dbcf5c73\") " Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960330 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/22754f83-6ba0-48f6-82f7-5a28b5c5498c-scripts\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960355 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jcs9\" (UniqueName: \"kubernetes.io/projected/22754f83-6ba0-48f6-82f7-5a28b5c5498c-kube-api-access-9jcs9\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22754f83-6ba0-48f6-82f7-5a28b5c5498c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960424 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22754f83-6ba0-48f6-82f7-5a28b5c5498c-config\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.960486 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22754f83-6ba0-48f6-82f7-5a28b5c5498c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.961410 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22754f83-6ba0-48f6-82f7-5a28b5c5498c-scripts\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.961665 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/22754f83-6ba0-48f6-82f7-5a28b5c5498c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.962111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22754f83-6ba0-48f6-82f7-5a28b5c5498c-config\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.965883 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22754f83-6ba0-48f6-82f7-5a28b5c5498c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.970665 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-kube-api-access-zmld8" (OuterVolumeSpecName: "kube-api-access-zmld8") pod "ad65f8fe-26a5-4700-8a13-a535dbcf5c73" (UID: "ad65f8fe-26a5-4700-8a13-a535dbcf5c73"). InnerVolumeSpecName "kube-api-access-zmld8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:24 crc kubenswrapper[4764]: I1204 01:08:24.991101 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jcs9\" (UniqueName: \"kubernetes.io/projected/22754f83-6ba0-48f6-82f7-5a28b5c5498c-kube-api-access-9jcs9\") pod \"ovn-northd-0\" (UID: \"22754f83-6ba0-48f6-82f7-5a28b5c5498c\") " pod="openstack/ovn-northd-0" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.012125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-config" (OuterVolumeSpecName: "config") pod "ad65f8fe-26a5-4700-8a13-a535dbcf5c73" (UID: "ad65f8fe-26a5-4700-8a13-a535dbcf5c73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.032651 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ad65f8fe-26a5-4700-8a13-a535dbcf5c73" (UID: "ad65f8fe-26a5-4700-8a13-a535dbcf5c73"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.062462 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmld8\" (UniqueName: \"kubernetes.io/projected/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-kube-api-access-zmld8\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.062487 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.063042 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ad65f8fe-26a5-4700-8a13-a535dbcf5c73-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.156289 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.227006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerStarted","Data":"bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9"} Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.231793 4764 generic.go:334] "Generic (PLEG): container finished" podID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerID="fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31" exitCode=0 Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.231852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" event={"ID":"ad65f8fe-26a5-4700-8a13-a535dbcf5c73","Type":"ContainerDied","Data":"fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31"} Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.231885 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" event={"ID":"ad65f8fe-26a5-4700-8a13-a535dbcf5c73","Type":"ContainerDied","Data":"b1640544bb985c236cb08f4facff874e89cf4fb7748480abd4aa15b094d58f7c"} Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.231910 4764 scope.go:117] "RemoveContainer" containerID="fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.231903 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f79bf7859-gwrqw" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.247874 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6wkhn" podStartSLOduration=2.782557502 podStartE2EDuration="5.247852397s" podCreationTimestamp="2025-12-04 01:08:20 +0000 UTC" firstStartedPulling="2025-12-04 01:08:22.182319587 +0000 UTC m=+5237.943644008" lastFinishedPulling="2025-12-04 01:08:24.647614492 +0000 UTC m=+5240.408938903" observedRunningTime="2025-12-04 01:08:25.246658138 +0000 UTC m=+5241.007982549" watchObservedRunningTime="2025-12-04 01:08:25.247852397 +0000 UTC m=+5241.009176808" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.286856 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-gwrqw"] Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.297963 4764 scope.go:117] "RemoveContainer" containerID="eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.299387 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f79bf7859-gwrqw"] Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.321213 4764 scope.go:117] "RemoveContainer" containerID="fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31" Dec 04 01:08:25 crc kubenswrapper[4764]: E1204 01:08:25.323022 4764 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31\": container with ID starting with fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31 not found: ID does not exist" containerID="fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.323142 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31"} err="failed to get container status \"fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31\": rpc error: code = NotFound desc = could not find container \"fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31\": container with ID starting with fee93ccab9a8e3152390bfc9430e8ba17d1f2613f9046f04e980f1026a310e31 not found: ID does not exist" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.323177 4764 scope.go:117] "RemoveContainer" containerID="eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676" Dec 04 01:08:25 crc kubenswrapper[4764]: E1204 01:08:25.323513 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676\": container with ID starting with eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676 not found: ID does not exist" containerID="eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.323556 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676"} err="failed to get container status \"eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676\": rpc error: code = NotFound desc = could not find 
container \"eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676\": container with ID starting with eaa9fbc8e2c605507a67d832f90bf08520ac7a8da138bad8538effb6f6d4a676 not found: ID does not exist" Dec 04 01:08:25 crc kubenswrapper[4764]: I1204 01:08:25.651654 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 01:08:25 crc kubenswrapper[4764]: W1204 01:08:25.659527 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22754f83_6ba0_48f6_82f7_5a28b5c5498c.slice/crio-c5d13e7cbb45d5dc73f5363e752c4659b95e3bc7991365363e6281dc8b7bc23f WatchSource:0}: Error finding container c5d13e7cbb45d5dc73f5363e752c4659b95e3bc7991365363e6281dc8b7bc23f: Status 404 returned error can't find the container with id c5d13e7cbb45d5dc73f5363e752c4659b95e3bc7991365363e6281dc8b7bc23f Dec 04 01:08:26 crc kubenswrapper[4764]: I1204 01:08:26.241954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22754f83-6ba0-48f6-82f7-5a28b5c5498c","Type":"ContainerStarted","Data":"75102e82edefdb8bcfa43ee26258f821e24ff63934354e320920c5f126f82d1c"} Dec 04 01:08:26 crc kubenswrapper[4764]: I1204 01:08:26.242281 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22754f83-6ba0-48f6-82f7-5a28b5c5498c","Type":"ContainerStarted","Data":"0fdc644d3ee99cf1289e40dfadb35910dc33e77bd35db5dd3ec29756a126dd9a"} Dec 04 01:08:26 crc kubenswrapper[4764]: I1204 01:08:26.242301 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 01:08:26 crc kubenswrapper[4764]: I1204 01:08:26.242313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"22754f83-6ba0-48f6-82f7-5a28b5c5498c","Type":"ContainerStarted","Data":"c5d13e7cbb45d5dc73f5363e752c4659b95e3bc7991365363e6281dc8b7bc23f"} Dec 04 01:08:26 crc kubenswrapper[4764]: 
I1204 01:08:26.264750 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.264703241 podStartE2EDuration="2.264703241s" podCreationTimestamp="2025-12-04 01:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:26.261175494 +0000 UTC m=+5242.022499935" watchObservedRunningTime="2025-12-04 01:08:26.264703241 +0000 UTC m=+5242.026027692" Dec 04 01:08:26 crc kubenswrapper[4764]: I1204 01:08:26.568533 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" path="/var/lib/kubelet/pods/ad65f8fe-26a5-4700-8a13-a535dbcf5c73/volumes" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.570558 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-lblj5"] Dec 04 01:08:29 crc kubenswrapper[4764]: E1204 01:08:29.571552 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerName="dnsmasq-dns" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.571570 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerName="dnsmasq-dns" Dec 04 01:08:29 crc kubenswrapper[4764]: E1204 01:08:29.571608 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerName="init" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.571618 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerName="init" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.571843 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad65f8fe-26a5-4700-8a13-a535dbcf5c73" containerName="dnsmasq-dns" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.572560 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.586018 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lblj5"] Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.674312 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0756-account-create-update-2l687"] Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.675418 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.677192 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.683353 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0756-account-create-update-2l687"] Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.760957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-operator-scripts\") pod \"keystone-db-create-lblj5\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.761019 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9jp\" (UniqueName: \"kubernetes.io/projected/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-kube-api-access-wl9jp\") pod \"keystone-db-create-lblj5\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.862422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/0263bc12-675e-4dbf-a401-6acda0d97f1c-operator-scripts\") pod \"keystone-0756-account-create-update-2l687\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.862475 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-operator-scripts\") pod \"keystone-db-create-lblj5\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.862502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9jp\" (UniqueName: \"kubernetes.io/projected/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-kube-api-access-wl9jp\") pod \"keystone-db-create-lblj5\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.862601 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6l5n\" (UniqueName: \"kubernetes.io/projected/0263bc12-675e-4dbf-a401-6acda0d97f1c-kube-api-access-s6l5n\") pod \"keystone-0756-account-create-update-2l687\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.863428 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-operator-scripts\") pod \"keystone-db-create-lblj5\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.883500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wl9jp\" (UniqueName: \"kubernetes.io/projected/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-kube-api-access-wl9jp\") pod \"keystone-db-create-lblj5\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.891424 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.964250 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6l5n\" (UniqueName: \"kubernetes.io/projected/0263bc12-675e-4dbf-a401-6acda0d97f1c-kube-api-access-s6l5n\") pod \"keystone-0756-account-create-update-2l687\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.964346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0263bc12-675e-4dbf-a401-6acda0d97f1c-operator-scripts\") pod \"keystone-0756-account-create-update-2l687\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.965360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0263bc12-675e-4dbf-a401-6acda0d97f1c-operator-scripts\") pod \"keystone-0756-account-create-update-2l687\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.981632 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6l5n\" (UniqueName: \"kubernetes.io/projected/0263bc12-675e-4dbf-a401-6acda0d97f1c-kube-api-access-s6l5n\") pod 
\"keystone-0756-account-create-update-2l687\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:29 crc kubenswrapper[4764]: I1204 01:08:29.995006 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:30 crc kubenswrapper[4764]: I1204 01:08:30.434666 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-lblj5"] Dec 04 01:08:30 crc kubenswrapper[4764]: W1204 01:08:30.443822 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7bb7ec5_5f97_4a2f_9dca_20f8f38c48a1.slice/crio-364689ad70e6f062a0325ff1fb286eb82df51f9218366866ab4dc72ce8b149b3 WatchSource:0}: Error finding container 364689ad70e6f062a0325ff1fb286eb82df51f9218366866ab4dc72ce8b149b3: Status 404 returned error can't find the container with id 364689ad70e6f062a0325ff1fb286eb82df51f9218366866ab4dc72ce8b149b3 Dec 04 01:08:30 crc kubenswrapper[4764]: W1204 01:08:30.548710 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0263bc12_675e_4dbf_a401_6acda0d97f1c.slice/crio-b103a15f2cb8e2ac93bc68563f47af2576eef090ee130d38b129b9d97789ab8a WatchSource:0}: Error finding container b103a15f2cb8e2ac93bc68563f47af2576eef090ee130d38b129b9d97789ab8a: Status 404 returned error can't find the container with id b103a15f2cb8e2ac93bc68563f47af2576eef090ee130d38b129b9d97789ab8a Dec 04 01:08:30 crc kubenswrapper[4764]: I1204 01:08:30.555370 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0756-account-create-update-2l687"] Dec 04 01:08:30 crc kubenswrapper[4764]: I1204 01:08:30.990713 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:30 crc kubenswrapper[4764]: 
I1204 01:08:30.990781 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.043225 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.293495 4764 generic.go:334] "Generic (PLEG): container finished" podID="0263bc12-675e-4dbf-a401-6acda0d97f1c" containerID="9f118afdcc1fad1d9a4385b345c9048296d3f0e90089966e80a3e11a8ae4da0d" exitCode=0 Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.293602 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0756-account-create-update-2l687" event={"ID":"0263bc12-675e-4dbf-a401-6acda0d97f1c","Type":"ContainerDied","Data":"9f118afdcc1fad1d9a4385b345c9048296d3f0e90089966e80a3e11a8ae4da0d"} Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.293694 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0756-account-create-update-2l687" event={"ID":"0263bc12-675e-4dbf-a401-6acda0d97f1c","Type":"ContainerStarted","Data":"b103a15f2cb8e2ac93bc68563f47af2576eef090ee130d38b129b9d97789ab8a"} Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.295577 4764 generic.go:334] "Generic (PLEG): container finished" podID="e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" containerID="854a4bef5dc6e0fc8518e4ef4592640a0f782cb9a570bef229093b7030cc69cf" exitCode=0 Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.295632 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lblj5" event={"ID":"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1","Type":"ContainerDied","Data":"854a4bef5dc6e0fc8518e4ef4592640a0f782cb9a570bef229093b7030cc69cf"} Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.295670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lblj5" 
event={"ID":"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1","Type":"ContainerStarted","Data":"364689ad70e6f062a0325ff1fb286eb82df51f9218366866ab4dc72ce8b149b3"} Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.373577 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:31 crc kubenswrapper[4764]: I1204 01:08:31.424287 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wkhn"] Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.796267 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.801647 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.827018 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6l5n\" (UniqueName: \"kubernetes.io/projected/0263bc12-675e-4dbf-a401-6acda0d97f1c-kube-api-access-s6l5n\") pod \"0263bc12-675e-4dbf-a401-6acda0d97f1c\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.827085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-operator-scripts\") pod \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.827234 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9jp\" (UniqueName: \"kubernetes.io/projected/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-kube-api-access-wl9jp\") pod \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\" (UID: \"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1\") " 
Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.827255 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0263bc12-675e-4dbf-a401-6acda0d97f1c-operator-scripts\") pod \"0263bc12-675e-4dbf-a401-6acda0d97f1c\" (UID: \"0263bc12-675e-4dbf-a401-6acda0d97f1c\") " Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.828256 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0263bc12-675e-4dbf-a401-6acda0d97f1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0263bc12-675e-4dbf-a401-6acda0d97f1c" (UID: "0263bc12-675e-4dbf-a401-6acda0d97f1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.828602 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" (UID: "e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.833316 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-kube-api-access-wl9jp" (OuterVolumeSpecName: "kube-api-access-wl9jp") pod "e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" (UID: "e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1"). InnerVolumeSpecName "kube-api-access-wl9jp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.838586 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0263bc12-675e-4dbf-a401-6acda0d97f1c-kube-api-access-s6l5n" (OuterVolumeSpecName: "kube-api-access-s6l5n") pod "0263bc12-675e-4dbf-a401-6acda0d97f1c" (UID: "0263bc12-675e-4dbf-a401-6acda0d97f1c"). InnerVolumeSpecName "kube-api-access-s6l5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.930349 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.930376 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9jp\" (UniqueName: \"kubernetes.io/projected/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1-kube-api-access-wl9jp\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.930389 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0263bc12-675e-4dbf-a401-6acda0d97f1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:32 crc kubenswrapper[4764]: I1204 01:08:32.930397 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6l5n\" (UniqueName: \"kubernetes.io/projected/0263bc12-675e-4dbf-a401-6acda0d97f1c-kube-api-access-s6l5n\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.318397 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-lblj5" event={"ID":"e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1","Type":"ContainerDied","Data":"364689ad70e6f062a0325ff1fb286eb82df51f9218366866ab4dc72ce8b149b3"} Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.318458 
4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-lblj5" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.318502 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="364689ad70e6f062a0325ff1fb286eb82df51f9218366866ab4dc72ce8b149b3" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.320420 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0756-account-create-update-2l687" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.320433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0756-account-create-update-2l687" event={"ID":"0263bc12-675e-4dbf-a401-6acda0d97f1c","Type":"ContainerDied","Data":"b103a15f2cb8e2ac93bc68563f47af2576eef090ee130d38b129b9d97789ab8a"} Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.320494 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b103a15f2cb8e2ac93bc68563f47af2576eef090ee130d38b129b9d97789ab8a" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.320586 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6wkhn" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="registry-server" containerID="cri-o://bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9" gracePeriod=2 Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.866106 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.947333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqq2d\" (UniqueName: \"kubernetes.io/projected/bcb42619-9693-474e-a889-5ea1c2b6decc-kube-api-access-sqq2d\") pod \"bcb42619-9693-474e-a889-5ea1c2b6decc\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.947373 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-utilities\") pod \"bcb42619-9693-474e-a889-5ea1c2b6decc\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.947425 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-catalog-content\") pod \"bcb42619-9693-474e-a889-5ea1c2b6decc\" (UID: \"bcb42619-9693-474e-a889-5ea1c2b6decc\") " Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.948452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-utilities" (OuterVolumeSpecName: "utilities") pod "bcb42619-9693-474e-a889-5ea1c2b6decc" (UID: "bcb42619-9693-474e-a889-5ea1c2b6decc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:08:33 crc kubenswrapper[4764]: I1204 01:08:33.951780 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcb42619-9693-474e-a889-5ea1c2b6decc-kube-api-access-sqq2d" (OuterVolumeSpecName: "kube-api-access-sqq2d") pod "bcb42619-9693-474e-a889-5ea1c2b6decc" (UID: "bcb42619-9693-474e-a889-5ea1c2b6decc"). InnerVolumeSpecName "kube-api-access-sqq2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.005489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcb42619-9693-474e-a889-5ea1c2b6decc" (UID: "bcb42619-9693-474e-a889-5ea1c2b6decc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.048745 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqq2d\" (UniqueName: \"kubernetes.io/projected/bcb42619-9693-474e-a889-5ea1c2b6decc-kube-api-access-sqq2d\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.048776 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.048785 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcb42619-9693-474e-a889-5ea1c2b6decc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.334796 4764 generic.go:334] "Generic (PLEG): container finished" podID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerID="bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9" exitCode=0 Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.334867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerDied","Data":"bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9"} Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.334894 4764 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6wkhn" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.334924 4764 scope.go:117] "RemoveContainer" containerID="bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.334906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6wkhn" event={"ID":"bcb42619-9693-474e-a889-5ea1c2b6decc","Type":"ContainerDied","Data":"542a2a3b193c1f7861f4058ce08e57a169fd357debc253951547bc791387820a"} Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.375312 4764 scope.go:117] "RemoveContainer" containerID="3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.411494 4764 scope.go:117] "RemoveContainer" containerID="5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.422512 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6wkhn"] Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.432549 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6wkhn"] Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.460858 4764 scope.go:117] "RemoveContainer" containerID="bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9" Dec 04 01:08:34 crc kubenswrapper[4764]: E1204 01:08:34.463085 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9\": container with ID starting with bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9 not found: ID does not exist" containerID="bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.463127 
4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9"} err="failed to get container status \"bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9\": rpc error: code = NotFound desc = could not find container \"bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9\": container with ID starting with bf51fc504b45bb1b844abc5cf23a512c79db62986e9399921a8375e03bf8b0b9 not found: ID does not exist" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.463155 4764 scope.go:117] "RemoveContainer" containerID="3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025" Dec 04 01:08:34 crc kubenswrapper[4764]: E1204 01:08:34.463414 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025\": container with ID starting with 3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025 not found: ID does not exist" containerID="3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.463440 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025"} err="failed to get container status \"3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025\": rpc error: code = NotFound desc = could not find container \"3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025\": container with ID starting with 3d088676438c8b42fd28dc0c3b11ad495f3be9053e6dede0a616a13a04a55025 not found: ID does not exist" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.463457 4764 scope.go:117] "RemoveContainer" containerID="5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052" Dec 04 01:08:34 crc kubenswrapper[4764]: E1204 
01:08:34.463818 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052\": container with ID starting with 5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052 not found: ID does not exist" containerID="5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.463888 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052"} err="failed to get container status \"5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052\": rpc error: code = NotFound desc = could not find container \"5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052\": container with ID starting with 5f35f44573112694270ec88bc364584f4306494f8945dd210db65692d03aa052 not found: ID does not exist" Dec 04 01:08:34 crc kubenswrapper[4764]: I1204 01:08:34.577642 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" path="/var/lib/kubelet/pods/bcb42619-9693-474e-a889-5ea1c2b6decc/volumes" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.187324 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vqk6r"] Dec 04 01:08:35 crc kubenswrapper[4764]: E1204 01:08:35.189350 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="extract-utilities" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.189428 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="extract-utilities" Dec 04 01:08:35 crc kubenswrapper[4764]: E1204 01:08:35.189490 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" 
containerName="mariadb-database-create" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.189538 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" containerName="mariadb-database-create" Dec 04 01:08:35 crc kubenswrapper[4764]: E1204 01:08:35.189595 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0263bc12-675e-4dbf-a401-6acda0d97f1c" containerName="mariadb-account-create-update" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.189656 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0263bc12-675e-4dbf-a401-6acda0d97f1c" containerName="mariadb-account-create-update" Dec 04 01:08:35 crc kubenswrapper[4764]: E1204 01:08:35.189711 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="extract-content" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.189778 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="extract-content" Dec 04 01:08:35 crc kubenswrapper[4764]: E1204 01:08:35.189829 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="registry-server" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.189874 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="registry-server" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.190062 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0263bc12-675e-4dbf-a401-6acda0d97f1c" containerName="mariadb-account-create-update" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.190136 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcb42619-9693-474e-a889-5ea1c2b6decc" containerName="registry-server" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.190201 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" containerName="mariadb-database-create" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.190786 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.196960 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.198712 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lzmjr" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.199434 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.199971 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.206367 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vqk6r"] Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.245596 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.270062 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2scb\" (UniqueName: \"kubernetes.io/projected/db96a1e7-672e-4350-ad6f-c3802c61809c-kube-api-access-j2scb\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.270149 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-config-data\") pod \"keystone-db-sync-vqk6r\" (UID: 
\"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.270421 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-combined-ca-bundle\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.371527 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-config-data\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.371648 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-combined-ca-bundle\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.371706 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2scb\" (UniqueName: \"kubernetes.io/projected/db96a1e7-672e-4350-ad6f-c3802c61809c-kube-api-access-j2scb\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.380351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-combined-ca-bundle\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" 
Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.380510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-config-data\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.387544 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2scb\" (UniqueName: \"kubernetes.io/projected/db96a1e7-672e-4350-ad6f-c3802c61809c-kube-api-access-j2scb\") pod \"keystone-db-sync-vqk6r\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.508947 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:35 crc kubenswrapper[4764]: W1204 01:08:35.955805 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb96a1e7_672e_4350_ad6f_c3802c61809c.slice/crio-1542f547447a02934c8216cfb298398e50463f4140bf68d9e9e1698709e7effe WatchSource:0}: Error finding container 1542f547447a02934c8216cfb298398e50463f4140bf68d9e9e1698709e7effe: Status 404 returned error can't find the container with id 1542f547447a02934c8216cfb298398e50463f4140bf68d9e9e1698709e7effe Dec 04 01:08:35 crc kubenswrapper[4764]: I1204 01:08:35.957592 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vqk6r"] Dec 04 01:08:36 crc kubenswrapper[4764]: I1204 01:08:36.352948 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vqk6r" event={"ID":"db96a1e7-672e-4350-ad6f-c3802c61809c","Type":"ContainerStarted","Data":"054ca7e537e8dcbed4eac89cc315a82e736a5cf163fb6cc4e51fa032c82b091b"} Dec 04 01:08:36 crc kubenswrapper[4764]: I1204 01:08:36.353334 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vqk6r" event={"ID":"db96a1e7-672e-4350-ad6f-c3802c61809c","Type":"ContainerStarted","Data":"1542f547447a02934c8216cfb298398e50463f4140bf68d9e9e1698709e7effe"} Dec 04 01:08:36 crc kubenswrapper[4764]: I1204 01:08:36.373035 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vqk6r" podStartSLOduration=1.373017675 podStartE2EDuration="1.373017675s" podCreationTimestamp="2025-12-04 01:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:36.370276207 +0000 UTC m=+5252.131600618" watchObservedRunningTime="2025-12-04 01:08:36.373017675 +0000 UTC m=+5252.134342086" Dec 04 01:08:38 crc kubenswrapper[4764]: I1204 01:08:38.392512 4764 generic.go:334] "Generic (PLEG): container finished" podID="db96a1e7-672e-4350-ad6f-c3802c61809c" containerID="054ca7e537e8dcbed4eac89cc315a82e736a5cf163fb6cc4e51fa032c82b091b" exitCode=0 Dec 04 01:08:38 crc kubenswrapper[4764]: I1204 01:08:38.392577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vqk6r" event={"ID":"db96a1e7-672e-4350-ad6f-c3802c61809c","Type":"ContainerDied","Data":"054ca7e537e8dcbed4eac89cc315a82e736a5cf163fb6cc4e51fa032c82b091b"} Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.773253 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.951350 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-config-data\") pod \"db96a1e7-672e-4350-ad6f-c3802c61809c\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.952108 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-combined-ca-bundle\") pod \"db96a1e7-672e-4350-ad6f-c3802c61809c\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.952562 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2scb\" (UniqueName: \"kubernetes.io/projected/db96a1e7-672e-4350-ad6f-c3802c61809c-kube-api-access-j2scb\") pod \"db96a1e7-672e-4350-ad6f-c3802c61809c\" (UID: \"db96a1e7-672e-4350-ad6f-c3802c61809c\") " Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.959734 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db96a1e7-672e-4350-ad6f-c3802c61809c-kube-api-access-j2scb" (OuterVolumeSpecName: "kube-api-access-j2scb") pod "db96a1e7-672e-4350-ad6f-c3802c61809c" (UID: "db96a1e7-672e-4350-ad6f-c3802c61809c"). InnerVolumeSpecName "kube-api-access-j2scb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.983046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db96a1e7-672e-4350-ad6f-c3802c61809c" (UID: "db96a1e7-672e-4350-ad6f-c3802c61809c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:39 crc kubenswrapper[4764]: I1204 01:08:39.996788 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-config-data" (OuterVolumeSpecName: "config-data") pod "db96a1e7-672e-4350-ad6f-c3802c61809c" (UID: "db96a1e7-672e-4350-ad6f-c3802c61809c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.054942 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2scb\" (UniqueName: \"kubernetes.io/projected/db96a1e7-672e-4350-ad6f-c3802c61809c-kube-api-access-j2scb\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.054978 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.054987 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db96a1e7-672e-4350-ad6f-c3802c61809c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.413140 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vqk6r" event={"ID":"db96a1e7-672e-4350-ad6f-c3802c61809c","Type":"ContainerDied","Data":"1542f547447a02934c8216cfb298398e50463f4140bf68d9e9e1698709e7effe"} Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.413206 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1542f547447a02934c8216cfb298398e50463f4140bf68d9e9e1698709e7effe" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.413231 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vqk6r" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.663037 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb8758895-5pcsh"] Dec 04 01:08:40 crc kubenswrapper[4764]: E1204 01:08:40.663339 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db96a1e7-672e-4350-ad6f-c3802c61809c" containerName="keystone-db-sync" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.663350 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="db96a1e7-672e-4350-ad6f-c3802c61809c" containerName="keystone-db-sync" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.663493 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="db96a1e7-672e-4350-ad6f-c3802c61809c" containerName="keystone-db-sync" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.664241 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.665727 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.665806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.665833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-dns-svc\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.665873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzp67\" (UniqueName: \"kubernetes.io/projected/d178584a-4dc5-4161-8bd0-c55440c48f22-kube-api-access-wzp67\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.665901 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-config\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.694675 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb8758895-5pcsh"] Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.719693 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r9lsb"] Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.720724 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.723312 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lzmjr" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.732570 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.733400 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.734126 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.734244 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.743885 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r9lsb"] Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzp67\" (UniqueName: \"kubernetes.io/projected/d178584a-4dc5-4161-8bd0-c55440c48f22-kube-api-access-wzp67\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767502 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-config\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767549 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x7p4b\" (UniqueName: \"kubernetes.io/projected/256f0007-79b0-4f71-bd5c-015656d7805c-kube-api-access-x7p4b\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767574 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-credential-keys\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767597 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-fernet-keys\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767628 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-scripts\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767684 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-config-data\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767768 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-dns-svc\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.767917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-combined-ca-bundle\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.768798 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-dns-svc\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.769017 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-config\") pod 
\"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.769414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-nb\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.774308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-sb\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.790640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzp67\" (UniqueName: \"kubernetes.io/projected/d178584a-4dc5-4161-8bd0-c55440c48f22-kube-api-access-wzp67\") pod \"dnsmasq-dns-7cb8758895-5pcsh\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.869225 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-config-data\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.869317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-combined-ca-bundle\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") 
" pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.869396 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7p4b\" (UniqueName: \"kubernetes.io/projected/256f0007-79b0-4f71-bd5c-015656d7805c-kube-api-access-x7p4b\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.869422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-credential-keys\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.869444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-fernet-keys\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.869481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-scripts\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.873320 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-credential-keys\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.877964 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-fernet-keys\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.879206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-scripts\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.879585 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-config-data\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.881788 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-combined-ca-bundle\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.895232 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7p4b\" (UniqueName: \"kubernetes.io/projected/256f0007-79b0-4f71-bd5c-015656d7805c-kube-api-access-x7p4b\") pod \"keystone-bootstrap-r9lsb\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:40 crc kubenswrapper[4764]: I1204 01:08:40.977123 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:41 crc kubenswrapper[4764]: I1204 01:08:41.040185 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:41 crc kubenswrapper[4764]: I1204 01:08:41.421290 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb8758895-5pcsh"] Dec 04 01:08:41 crc kubenswrapper[4764]: W1204 01:08:41.423736 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd178584a_4dc5_4161_8bd0_c55440c48f22.slice/crio-6b2dfa295b387f9816f710c2f5663af394a9459238020d1d376fb4fc75f78d37 WatchSource:0}: Error finding container 6b2dfa295b387f9816f710c2f5663af394a9459238020d1d376fb4fc75f78d37: Status 404 returned error can't find the container with id 6b2dfa295b387f9816f710c2f5663af394a9459238020d1d376fb4fc75f78d37 Dec 04 01:08:41 crc kubenswrapper[4764]: I1204 01:08:41.548845 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r9lsb"] Dec 04 01:08:41 crc kubenswrapper[4764]: W1204 01:08:41.548994 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod256f0007_79b0_4f71_bd5c_015656d7805c.slice/crio-99e69cc44352d4e2a6568d0b4463dafe89ed3eaf04a8ba17d20707a419ac4786 WatchSource:0}: Error finding container 99e69cc44352d4e2a6568d0b4463dafe89ed3eaf04a8ba17d20707a419ac4786: Status 404 returned error can't find the container with id 99e69cc44352d4e2a6568d0b4463dafe89ed3eaf04a8ba17d20707a419ac4786 Dec 04 01:08:42 crc kubenswrapper[4764]: I1204 01:08:42.434140 4764 generic.go:334] "Generic (PLEG): container finished" podID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerID="8b5214240777222dca4fc6835fb0405b4652f16ed9c3c611628e44553bfb098c" exitCode=0 Dec 04 01:08:42 crc kubenswrapper[4764]: I1204 01:08:42.434221 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" event={"ID":"d178584a-4dc5-4161-8bd0-c55440c48f22","Type":"ContainerDied","Data":"8b5214240777222dca4fc6835fb0405b4652f16ed9c3c611628e44553bfb098c"} Dec 04 01:08:42 crc kubenswrapper[4764]: I1204 01:08:42.434259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" event={"ID":"d178584a-4dc5-4161-8bd0-c55440c48f22","Type":"ContainerStarted","Data":"6b2dfa295b387f9816f710c2f5663af394a9459238020d1d376fb4fc75f78d37"} Dec 04 01:08:42 crc kubenswrapper[4764]: I1204 01:08:42.436077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9lsb" event={"ID":"256f0007-79b0-4f71-bd5c-015656d7805c","Type":"ContainerStarted","Data":"bd33bb6a49ac286e9942db465fb51ac9ad537c6fa520686436cd674819daed2e"} Dec 04 01:08:42 crc kubenswrapper[4764]: I1204 01:08:42.436098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9lsb" event={"ID":"256f0007-79b0-4f71-bd5c-015656d7805c","Type":"ContainerStarted","Data":"99e69cc44352d4e2a6568d0b4463dafe89ed3eaf04a8ba17d20707a419ac4786"} Dec 04 01:08:42 crc kubenswrapper[4764]: I1204 01:08:42.486000 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r9lsb" podStartSLOduration=2.4859764699999998 podStartE2EDuration="2.48597647s" podCreationTimestamp="2025-12-04 01:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:42.47545327 +0000 UTC m=+5258.236777681" watchObservedRunningTime="2025-12-04 01:08:42.48597647 +0000 UTC m=+5258.247300901" Dec 04 01:08:43 crc kubenswrapper[4764]: I1204 01:08:43.450590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" 
event={"ID":"d178584a-4dc5-4161-8bd0-c55440c48f22","Type":"ContainerStarted","Data":"6f22abcb9c2914f432fb55c45c2999cb1f1716a519fe60db84fa4d49f779a659"} Dec 04 01:08:43 crc kubenswrapper[4764]: I1204 01:08:43.450911 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:43 crc kubenswrapper[4764]: I1204 01:08:43.486682 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" podStartSLOduration=3.486664235 podStartE2EDuration="3.486664235s" podCreationTimestamp="2025-12-04 01:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:43.483466686 +0000 UTC m=+5259.244791137" watchObservedRunningTime="2025-12-04 01:08:43.486664235 +0000 UTC m=+5259.247988646" Dec 04 01:08:45 crc kubenswrapper[4764]: I1204 01:08:45.473767 4764 generic.go:334] "Generic (PLEG): container finished" podID="256f0007-79b0-4f71-bd5c-015656d7805c" containerID="bd33bb6a49ac286e9942db465fb51ac9ad537c6fa520686436cd674819daed2e" exitCode=0 Dec 04 01:08:45 crc kubenswrapper[4764]: I1204 01:08:45.473827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9lsb" event={"ID":"256f0007-79b0-4f71-bd5c-015656d7805c","Type":"ContainerDied","Data":"bd33bb6a49ac286e9942db465fb51ac9ad537c6fa520686436cd674819daed2e"} Dec 04 01:08:46 crc kubenswrapper[4764]: I1204 01:08:46.876870 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.082064 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-fernet-keys\") pod \"256f0007-79b0-4f71-bd5c-015656d7805c\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.082158 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-config-data\") pod \"256f0007-79b0-4f71-bd5c-015656d7805c\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.082183 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-combined-ca-bundle\") pod \"256f0007-79b0-4f71-bd5c-015656d7805c\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.082277 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-credential-keys\") pod \"256f0007-79b0-4f71-bd5c-015656d7805c\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.082299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7p4b\" (UniqueName: \"kubernetes.io/projected/256f0007-79b0-4f71-bd5c-015656d7805c-kube-api-access-x7p4b\") pod \"256f0007-79b0-4f71-bd5c-015656d7805c\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.082357 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-scripts\") pod \"256f0007-79b0-4f71-bd5c-015656d7805c\" (UID: \"256f0007-79b0-4f71-bd5c-015656d7805c\") " Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.089471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "256f0007-79b0-4f71-bd5c-015656d7805c" (UID: "256f0007-79b0-4f71-bd5c-015656d7805c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.095816 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-scripts" (OuterVolumeSpecName: "scripts") pod "256f0007-79b0-4f71-bd5c-015656d7805c" (UID: "256f0007-79b0-4f71-bd5c-015656d7805c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.095922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/256f0007-79b0-4f71-bd5c-015656d7805c-kube-api-access-x7p4b" (OuterVolumeSpecName: "kube-api-access-x7p4b") pod "256f0007-79b0-4f71-bd5c-015656d7805c" (UID: "256f0007-79b0-4f71-bd5c-015656d7805c"). InnerVolumeSpecName "kube-api-access-x7p4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.096011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "256f0007-79b0-4f71-bd5c-015656d7805c" (UID: "256f0007-79b0-4f71-bd5c-015656d7805c"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.109145 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "256f0007-79b0-4f71-bd5c-015656d7805c" (UID: "256f0007-79b0-4f71-bd5c-015656d7805c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.122751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-config-data" (OuterVolumeSpecName: "config-data") pod "256f0007-79b0-4f71-bd5c-015656d7805c" (UID: "256f0007-79b0-4f71-bd5c-015656d7805c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.183915 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.183946 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.183955 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.183965 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 
01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.183973 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7p4b\" (UniqueName: \"kubernetes.io/projected/256f0007-79b0-4f71-bd5c-015656d7805c-kube-api-access-x7p4b\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.183982 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/256f0007-79b0-4f71-bd5c-015656d7805c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.492048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9lsb" event={"ID":"256f0007-79b0-4f71-bd5c-015656d7805c","Type":"ContainerDied","Data":"99e69cc44352d4e2a6568d0b4463dafe89ed3eaf04a8ba17d20707a419ac4786"} Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.492552 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99e69cc44352d4e2a6568d0b4463dafe89ed3eaf04a8ba17d20707a419ac4786" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.492128 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r9lsb" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.676595 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r9lsb"] Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.688128 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r9lsb"] Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.791707 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9vcrj"] Dec 04 01:08:47 crc kubenswrapper[4764]: E1204 01:08:47.792513 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="256f0007-79b0-4f71-bd5c-015656d7805c" containerName="keystone-bootstrap" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.792623 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="256f0007-79b0-4f71-bd5c-015656d7805c" containerName="keystone-bootstrap" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.793026 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="256f0007-79b0-4f71-bd5c-015656d7805c" containerName="keystone-bootstrap" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.793904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.797441 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.797979 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lzmjr" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.798246 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.802144 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.802336 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.829524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9vcrj"] Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.995742 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-scripts\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.995787 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxq6d\" (UniqueName: \"kubernetes.io/projected/5d9ea037-924a-490f-ac1b-96d82921c1fb-kube-api-access-lxq6d\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.995817 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-combined-ca-bundle\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.996604 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-config-data\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.996651 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-credential-keys\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:47 crc kubenswrapper[4764]: I1204 01:08:47.996767 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-fernet-keys\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.098801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-config-data\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.098844 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-credential-keys\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.098875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-fernet-keys\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.098914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-scripts\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.098933 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxq6d\" (UniqueName: \"kubernetes.io/projected/5d9ea037-924a-490f-ac1b-96d82921c1fb-kube-api-access-lxq6d\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.098949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-combined-ca-bundle\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.103341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-combined-ca-bundle\") pod 
\"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.105090 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-config-data\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.105168 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-scripts\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.105388 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-credential-keys\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.107498 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-fernet-keys\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.133055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxq6d\" (UniqueName: \"kubernetes.io/projected/5d9ea037-924a-490f-ac1b-96d82921c1fb-kube-api-access-lxq6d\") pod \"keystone-bootstrap-9vcrj\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc 
kubenswrapper[4764]: I1204 01:08:48.430706 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.566022 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="256f0007-79b0-4f71-bd5c-015656d7805c" path="/var/lib/kubelet/pods/256f0007-79b0-4f71-bd5c-015656d7805c/volumes" Dec 04 01:08:48 crc kubenswrapper[4764]: I1204 01:08:48.914643 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9vcrj"] Dec 04 01:08:48 crc kubenswrapper[4764]: W1204 01:08:48.922635 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9ea037_924a_490f_ac1b_96d82921c1fb.slice/crio-a6275d511fda58d7403590dc9fb6010d07e37d02f4ebc8c93ea0420b52039205 WatchSource:0}: Error finding container a6275d511fda58d7403590dc9fb6010d07e37d02f4ebc8c93ea0420b52039205: Status 404 returned error can't find the container with id a6275d511fda58d7403590dc9fb6010d07e37d02f4ebc8c93ea0420b52039205 Dec 04 01:08:49 crc kubenswrapper[4764]: I1204 01:08:49.532118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vcrj" event={"ID":"5d9ea037-924a-490f-ac1b-96d82921c1fb","Type":"ContainerStarted","Data":"757fef3a8aa70238a69a8d8d417e4faf39147bdd5ee70299b9af5eb4ea4d2c7e"} Dec 04 01:08:49 crc kubenswrapper[4764]: I1204 01:08:49.532546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vcrj" event={"ID":"5d9ea037-924a-490f-ac1b-96d82921c1fb","Type":"ContainerStarted","Data":"a6275d511fda58d7403590dc9fb6010d07e37d02f4ebc8c93ea0420b52039205"} Dec 04 01:08:49 crc kubenswrapper[4764]: I1204 01:08:49.551949 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9vcrj" podStartSLOduration=2.551923724 podStartE2EDuration="2.551923724s" 
podCreationTimestamp="2025-12-04 01:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:08:49.549446033 +0000 UTC m=+5265.310770494" watchObservedRunningTime="2025-12-04 01:08:49.551923724 +0000 UTC m=+5265.313248175" Dec 04 01:08:50 crc kubenswrapper[4764]: I1204 01:08:50.868615 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:08:50 crc kubenswrapper[4764]: I1204 01:08:50.869009 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:08:50 crc kubenswrapper[4764]: I1204 01:08:50.978957 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.058693 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b69c8799-g4kzf"] Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.065146 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerName="dnsmasq-dns" containerID="cri-o://0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd" gracePeriod=10 Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.545591 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.550590 4764 generic.go:334] "Generic (PLEG): container finished" podID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerID="0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd" exitCode=0 Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.550628 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" event={"ID":"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7","Type":"ContainerDied","Data":"0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd"} Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.550652 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" event={"ID":"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7","Type":"ContainerDied","Data":"0100dd2be4167d3cc7d6a2bdd9fead3fb302edb1cadca005f3ddfd9842bb5e82"} Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.550667 4764 scope.go:117] "RemoveContainer" containerID="0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.550665 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b69c8799-g4kzf" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.585903 4764 scope.go:117] "RemoveContainer" containerID="c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.625558 4764 scope.go:117] "RemoveContainer" containerID="0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd" Dec 04 01:08:51 crc kubenswrapper[4764]: E1204 01:08:51.626082 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd\": container with ID starting with 0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd not found: ID does not exist" containerID="0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.626126 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd"} err="failed to get container status \"0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd\": rpc error: code = NotFound desc = could not find container \"0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd\": container with ID starting with 0f1a5c18d7592f49d92d9ccc18d25c19757656a233249f517aae9d871e98dfcd not found: ID does not exist" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.626153 4764 scope.go:117] "RemoveContainer" containerID="c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4" Dec 04 01:08:51 crc kubenswrapper[4764]: E1204 01:08:51.626404 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4\": container with ID starting with 
c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4 not found: ID does not exist" containerID="c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.626425 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4"} err="failed to get container status \"c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4\": rpc error: code = NotFound desc = could not find container \"c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4\": container with ID starting with c65198a08bc4c0076724dbe849b7795c9a7f2c7afb08f57c76fa1ffe57d2d1a4 not found: ID does not exist" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.665260 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-config\") pod \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.665310 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-nb\") pod \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.665378 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pfjc\" (UniqueName: \"kubernetes.io/projected/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-kube-api-access-9pfjc\") pod \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.665442 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-dns-svc\") pod \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.665467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-sb\") pod \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\" (UID: \"9cbf8936-8e2d-43b1-8fe2-dd85b68354c7\") " Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.675635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-kube-api-access-9pfjc" (OuterVolumeSpecName: "kube-api-access-9pfjc") pod "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" (UID: "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7"). InnerVolumeSpecName "kube-api-access-9pfjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.703271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" (UID: "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.716181 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" (UID: "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.717230 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" (UID: "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.723564 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-config" (OuterVolumeSpecName: "config") pod "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" (UID: "9cbf8936-8e2d-43b1-8fe2-dd85b68354c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.766923 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.766957 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.766970 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pfjc\" (UniqueName: \"kubernetes.io/projected/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-kube-api-access-9pfjc\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.766983 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:51 crc 
kubenswrapper[4764]: I1204 01:08:51.766994 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.892558 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b69c8799-g4kzf"] Dec 04 01:08:51 crc kubenswrapper[4764]: I1204 01:08:51.900545 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64b69c8799-g4kzf"] Dec 04 01:08:52 crc kubenswrapper[4764]: I1204 01:08:52.558850 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" path="/var/lib/kubelet/pods/9cbf8936-8e2d-43b1-8fe2-dd85b68354c7/volumes" Dec 04 01:08:52 crc kubenswrapper[4764]: I1204 01:08:52.564088 4764 generic.go:334] "Generic (PLEG): container finished" podID="5d9ea037-924a-490f-ac1b-96d82921c1fb" containerID="757fef3a8aa70238a69a8d8d417e4faf39147bdd5ee70299b9af5eb4ea4d2c7e" exitCode=0 Dec 04 01:08:52 crc kubenswrapper[4764]: I1204 01:08:52.564203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vcrj" event={"ID":"5d9ea037-924a-490f-ac1b-96d82921c1fb","Type":"ContainerDied","Data":"757fef3a8aa70238a69a8d8d417e4faf39147bdd5ee70299b9af5eb4ea4d2c7e"} Dec 04 01:08:53 crc kubenswrapper[4764]: I1204 01:08:53.940249 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.107141 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-fernet-keys\") pod \"5d9ea037-924a-490f-ac1b-96d82921c1fb\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.107601 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-scripts\") pod \"5d9ea037-924a-490f-ac1b-96d82921c1fb\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.107840 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-credential-keys\") pod \"5d9ea037-924a-490f-ac1b-96d82921c1fb\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.108143 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxq6d\" (UniqueName: \"kubernetes.io/projected/5d9ea037-924a-490f-ac1b-96d82921c1fb-kube-api-access-lxq6d\") pod \"5d9ea037-924a-490f-ac1b-96d82921c1fb\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.108460 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-config-data\") pod \"5d9ea037-924a-490f-ac1b-96d82921c1fb\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.108829 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-combined-ca-bundle\") pod \"5d9ea037-924a-490f-ac1b-96d82921c1fb\" (UID: \"5d9ea037-924a-490f-ac1b-96d82921c1fb\") " Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.113814 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5d9ea037-924a-490f-ac1b-96d82921c1fb" (UID: "5d9ea037-924a-490f-ac1b-96d82921c1fb"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.114443 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-scripts" (OuterVolumeSpecName: "scripts") pod "5d9ea037-924a-490f-ac1b-96d82921c1fb" (UID: "5d9ea037-924a-490f-ac1b-96d82921c1fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.119950 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9ea037-924a-490f-ac1b-96d82921c1fb-kube-api-access-lxq6d" (OuterVolumeSpecName: "kube-api-access-lxq6d") pod "5d9ea037-924a-490f-ac1b-96d82921c1fb" (UID: "5d9ea037-924a-490f-ac1b-96d82921c1fb"). InnerVolumeSpecName "kube-api-access-lxq6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.127764 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5d9ea037-924a-490f-ac1b-96d82921c1fb" (UID: "5d9ea037-924a-490f-ac1b-96d82921c1fb"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.150488 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-config-data" (OuterVolumeSpecName: "config-data") pod "5d9ea037-924a-490f-ac1b-96d82921c1fb" (UID: "5d9ea037-924a-490f-ac1b-96d82921c1fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.155232 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d9ea037-924a-490f-ac1b-96d82921c1fb" (UID: "5d9ea037-924a-490f-ac1b-96d82921c1fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.211244 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.211277 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.211285 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.211295 4764 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:54 crc 
kubenswrapper[4764]: I1204 01:08:54.211303 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxq6d\" (UniqueName: \"kubernetes.io/projected/5d9ea037-924a-490f-ac1b-96d82921c1fb-kube-api-access-lxq6d\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.211311 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9ea037-924a-490f-ac1b-96d82921c1fb-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.590771 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9vcrj" event={"ID":"5d9ea037-924a-490f-ac1b-96d82921c1fb","Type":"ContainerDied","Data":"a6275d511fda58d7403590dc9fb6010d07e37d02f4ebc8c93ea0420b52039205"} Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.591221 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6275d511fda58d7403590dc9fb6010d07e37d02f4ebc8c93ea0420b52039205" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.591107 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9vcrj" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.685073 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dd9cdf66d-j2gg2"] Dec 04 01:08:54 crc kubenswrapper[4764]: E1204 01:08:54.685556 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerName="dnsmasq-dns" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.685585 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerName="dnsmasq-dns" Dec 04 01:08:54 crc kubenswrapper[4764]: E1204 01:08:54.685605 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerName="init" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.685615 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerName="init" Dec 04 01:08:54 crc kubenswrapper[4764]: E1204 01:08:54.685641 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9ea037-924a-490f-ac1b-96d82921c1fb" containerName="keystone-bootstrap" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.685656 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9ea037-924a-490f-ac1b-96d82921c1fb" containerName="keystone-bootstrap" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.685932 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbf8936-8e2d-43b1-8fe2-dd85b68354c7" containerName="dnsmasq-dns" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.685963 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9ea037-924a-490f-ac1b-96d82921c1fb" containerName="keystone-bootstrap" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.686795 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.689286 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.690428 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.690930 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lzmjr" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.691027 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.738673 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dd9cdf66d-j2gg2"] Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.825318 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-scripts\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.825407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-fernet-keys\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.825440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9q8v\" (UniqueName: \"kubernetes.io/projected/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-kube-api-access-k9q8v\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: 
\"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.825469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-combined-ca-bundle\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.825588 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-credential-keys\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.825612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-config-data\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.927007 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-combined-ca-bundle\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.927050 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-credential-keys\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: 
\"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.927080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-config-data\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.927163 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-scripts\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.927195 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-fernet-keys\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.927215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9q8v\" (UniqueName: \"kubernetes.io/projected/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-kube-api-access-k9q8v\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.931426 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-fernet-keys\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 
01:08:54.931783 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-credential-keys\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.931983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-combined-ca-bundle\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.932976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-config-data\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.934945 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-scripts\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:54 crc kubenswrapper[4764]: I1204 01:08:54.950125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9q8v\" (UniqueName: \"kubernetes.io/projected/c9f24c8f-e68f-4397-8c51-94d78d3cbe83-kube-api-access-k9q8v\") pod \"keystone-dd9cdf66d-j2gg2\" (UID: \"c9f24c8f-e68f-4397-8c51-94d78d3cbe83\") " pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:55 crc kubenswrapper[4764]: I1204 01:08:55.022810 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:55 crc kubenswrapper[4764]: I1204 01:08:55.498071 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dd9cdf66d-j2gg2"] Dec 04 01:08:55 crc kubenswrapper[4764]: W1204 01:08:55.498083 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9f24c8f_e68f_4397_8c51_94d78d3cbe83.slice/crio-765d93ecb49c0051f936666638c477b01ec5d041f9852fce24b0818d3ee3e7a6 WatchSource:0}: Error finding container 765d93ecb49c0051f936666638c477b01ec5d041f9852fce24b0818d3ee3e7a6: Status 404 returned error can't find the container with id 765d93ecb49c0051f936666638c477b01ec5d041f9852fce24b0818d3ee3e7a6 Dec 04 01:08:55 crc kubenswrapper[4764]: I1204 01:08:55.599987 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd9cdf66d-j2gg2" event={"ID":"c9f24c8f-e68f-4397-8c51-94d78d3cbe83","Type":"ContainerStarted","Data":"765d93ecb49c0051f936666638c477b01ec5d041f9852fce24b0818d3ee3e7a6"} Dec 04 01:08:56 crc kubenswrapper[4764]: I1204 01:08:56.610846 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dd9cdf66d-j2gg2" event={"ID":"c9f24c8f-e68f-4397-8c51-94d78d3cbe83","Type":"ContainerStarted","Data":"5ef1dfa77fabcf96f759829a5a8009822e5051d3ec71b2c5f756f6986fe57f1a"} Dec 04 01:08:56 crc kubenswrapper[4764]: I1204 01:08:56.611235 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:08:56 crc kubenswrapper[4764]: I1204 01:08:56.643309 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-dd9cdf66d-j2gg2" podStartSLOduration=2.643274676 podStartE2EDuration="2.643274676s" podCreationTimestamp="2025-12-04 01:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 
01:08:56.624462922 +0000 UTC m=+5272.385787353" watchObservedRunningTime="2025-12-04 01:08:56.643274676 +0000 UTC m=+5272.404599127" Dec 04 01:09:10 crc kubenswrapper[4764]: I1204 01:09:10.497029 4764 scope.go:117] "RemoveContainer" containerID="f0205ce0ff190f5c7c123e36859f71da3fd06e06a9f6d74071470f50ae17ed84" Dec 04 01:09:10 crc kubenswrapper[4764]: I1204 01:09:10.516909 4764 scope.go:117] "RemoveContainer" containerID="908f8882c13fa4ddd89baecf281671248a003598034ff7da8bf7e2c50f753f3b" Dec 04 01:09:10 crc kubenswrapper[4764]: I1204 01:09:10.560119 4764 scope.go:117] "RemoveContainer" containerID="28ebd4cdf85bce048ab2c5ee28648941d0806d8c732311ca155d5cccaef0ecef" Dec 04 01:09:10 crc kubenswrapper[4764]: I1204 01:09:10.592961 4764 scope.go:117] "RemoveContainer" containerID="f855c84a45f3eec1b4c42f662fe44f778e141827a5519202687e2577db5bb614" Dec 04 01:09:10 crc kubenswrapper[4764]: I1204 01:09:10.632225 4764 scope.go:117] "RemoveContainer" containerID="22e38ce74dfe448caac44360669f607fb0bf2641aadafa2f34574f3b12140511" Dec 04 01:09:10 crc kubenswrapper[4764]: I1204 01:09:10.665159 4764 scope.go:117] "RemoveContainer" containerID="d8f9c8a8d8cb9e51294cefd1e2a2697e5b8e5008a429d09d07e46c613616d2a9" Dec 04 01:09:20 crc kubenswrapper[4764]: I1204 01:09:20.869638 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:09:20 crc kubenswrapper[4764]: I1204 01:09:20.870319 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:09:20 crc kubenswrapper[4764]: 
I1204 01:09:20.870382 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:09:20 crc kubenswrapper[4764]: I1204 01:09:20.871392 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:09:20 crc kubenswrapper[4764]: I1204 01:09:20.871486 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" gracePeriod=600 Dec 04 01:09:21 crc kubenswrapper[4764]: E1204 01:09:21.004829 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:09:21 crc kubenswrapper[4764]: I1204 01:09:21.876391 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" exitCode=0 Dec 04 01:09:21 crc kubenswrapper[4764]: I1204 01:09:21.876448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d"} Dec 04 01:09:21 crc kubenswrapper[4764]: I1204 01:09:21.876518 4764 scope.go:117] "RemoveContainer" containerID="b3e111b765473560b18e521875a6b90a1db7680da6cb7c351e1d070319575147" Dec 04 01:09:21 crc kubenswrapper[4764]: I1204 01:09:21.877012 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:09:21 crc kubenswrapper[4764]: E1204 01:09:21.877226 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:09:26 crc kubenswrapper[4764]: I1204 01:09:26.559299 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-dd9cdf66d-j2gg2" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.598353 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.599963 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.602408 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.602434 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.602641 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fbxnp" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.608627 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.693571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config-secret\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.693666 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.693851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwbk4\" (UniqueName: \"kubernetes.io/projected/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-kube-api-access-kwbk4\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.795019 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.795134 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwbk4\" (UniqueName: \"kubernetes.io/projected/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-kube-api-access-kwbk4\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.795170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config-secret\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.795998 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.801161 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config-secret\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.811538 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwbk4\" (UniqueName: 
\"kubernetes.io/projected/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-kube-api-access-kwbk4\") pod \"openstackclient\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " pod="openstack/openstackclient" Dec 04 01:09:29 crc kubenswrapper[4764]: I1204 01:09:29.919447 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 01:09:30 crc kubenswrapper[4764]: I1204 01:09:30.345134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 01:09:30 crc kubenswrapper[4764]: I1204 01:09:30.953615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9492ba61-0ca6-433e-9eac-9819f2f0ff4c","Type":"ContainerStarted","Data":"92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123"} Dec 04 01:09:30 crc kubenswrapper[4764]: I1204 01:09:30.954034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9492ba61-0ca6-433e-9eac-9819f2f0ff4c","Type":"ContainerStarted","Data":"b7be06b6bbc070b2786333dc1cce6852517dda210e4ce1cb2ed91a114145bf28"} Dec 04 01:09:30 crc kubenswrapper[4764]: I1204 01:09:30.978426 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.978401822 podStartE2EDuration="1.978401822s" podCreationTimestamp="2025-12-04 01:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:09:30.974899735 +0000 UTC m=+5306.736224186" watchObservedRunningTime="2025-12-04 01:09:30.978401822 +0000 UTC m=+5306.739726243" Dec 04 01:09:35 crc kubenswrapper[4764]: I1204 01:09:35.545619 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:09:35 crc kubenswrapper[4764]: E1204 01:09:35.546309 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:09:48 crc kubenswrapper[4764]: I1204 01:09:48.546404 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:09:48 crc kubenswrapper[4764]: E1204 01:09:48.547216 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:10:01 crc kubenswrapper[4764]: I1204 01:10:01.546442 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:10:01 crc kubenswrapper[4764]: E1204 01:10:01.547607 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:10:10 crc kubenswrapper[4764]: I1204 01:10:10.851038 4764 scope.go:117] "RemoveContainer" containerID="62a380c714e439dbe38ee9cef470b3ae135a7b7b820b683930771039d93cd660" Dec 04 01:10:10 crc kubenswrapper[4764]: I1204 01:10:10.895502 4764 scope.go:117] "RemoveContainer" 
containerID="3e3b3228d7c0b06eae8fb8c71e5d0679baa49e0a4308994d466d9095bf93530d" Dec 04 01:10:16 crc kubenswrapper[4764]: I1204 01:10:16.546380 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:10:16 crc kubenswrapper[4764]: E1204 01:10:16.547485 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:10:27 crc kubenswrapper[4764]: E1204 01:10:27.561377 4764 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:46390->38.102.83.13:39483: write tcp 38.102.83.13:46390->38.102.83.13:39483: write: broken pipe Dec 04 01:10:31 crc kubenswrapper[4764]: I1204 01:10:31.545905 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:10:31 crc kubenswrapper[4764]: E1204 01:10:31.547031 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:10:46 crc kubenswrapper[4764]: I1204 01:10:46.548601 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:10:46 crc kubenswrapper[4764]: E1204 01:10:46.554158 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:10:58 crc kubenswrapper[4764]: I1204 01:10:58.545855 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:10:58 crc kubenswrapper[4764]: E1204 01:10:58.546786 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.342688 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ae2c-account-create-update-8zmdj"] Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.344681 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.346438 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.358606 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f5f65"] Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.360171 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.367315 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ae2c-account-create-update-8zmdj"] Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.379591 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5f65"] Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.526300 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsm8c\" (UniqueName: \"kubernetes.io/projected/700c9327-890d-4ea4-bb92-c9542c0de314-kube-api-access-qsm8c\") pod \"barbican-ae2c-account-create-update-8zmdj\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.526373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-operator-scripts\") pod \"barbican-db-create-f5f65\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.526405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85q9w\" (UniqueName: \"kubernetes.io/projected/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-kube-api-access-85q9w\") pod \"barbican-db-create-f5f65\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.526437 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700c9327-890d-4ea4-bb92-c9542c0de314-operator-scripts\") pod \"barbican-ae2c-account-create-update-8zmdj\" 
(UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.628405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsm8c\" (UniqueName: \"kubernetes.io/projected/700c9327-890d-4ea4-bb92-c9542c0de314-kube-api-access-qsm8c\") pod \"barbican-ae2c-account-create-update-8zmdj\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.628515 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-operator-scripts\") pod \"barbican-db-create-f5f65\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.628559 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85q9w\" (UniqueName: \"kubernetes.io/projected/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-kube-api-access-85q9w\") pod \"barbican-db-create-f5f65\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.628602 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700c9327-890d-4ea4-bb92-c9542c0de314-operator-scripts\") pod \"barbican-ae2c-account-create-update-8zmdj\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.629789 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700c9327-890d-4ea4-bb92-c9542c0de314-operator-scripts\") pod 
\"barbican-ae2c-account-create-update-8zmdj\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.629879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-operator-scripts\") pod \"barbican-db-create-f5f65\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.656990 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsm8c\" (UniqueName: \"kubernetes.io/projected/700c9327-890d-4ea4-bb92-c9542c0de314-kube-api-access-qsm8c\") pod \"barbican-ae2c-account-create-update-8zmdj\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.658192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85q9w\" (UniqueName: \"kubernetes.io/projected/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-kube-api-access-85q9w\") pod \"barbican-db-create-f5f65\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.664175 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:07 crc kubenswrapper[4764]: I1204 01:11:07.681842 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.169793 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ae2c-account-create-update-8zmdj"] Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.264072 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f5f65"] Dec 04 01:11:08 crc kubenswrapper[4764]: W1204 01:11:08.274447 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56aaaa63_fc9e_4be4_b59e_94a58c3d871b.slice/crio-8349d06fb6f4c9fd5434686df817833e5bc5c36ab105c852e8da2169669ded1a WatchSource:0}: Error finding container 8349d06fb6f4c9fd5434686df817833e5bc5c36ab105c852e8da2169669ded1a: Status 404 returned error can't find the container with id 8349d06fb6f4c9fd5434686df817833e5bc5c36ab105c852e8da2169669ded1a Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.980932 4764 generic.go:334] "Generic (PLEG): container finished" podID="700c9327-890d-4ea4-bb92-c9542c0de314" containerID="3ca723cd8246c66c3f9f13e12aa5a610c17ada4d61700922ff2edb7f2e03c16b" exitCode=0 Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.981054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ae2c-account-create-update-8zmdj" event={"ID":"700c9327-890d-4ea4-bb92-c9542c0de314","Type":"ContainerDied","Data":"3ca723cd8246c66c3f9f13e12aa5a610c17ada4d61700922ff2edb7f2e03c16b"} Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.981096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ae2c-account-create-update-8zmdj" event={"ID":"700c9327-890d-4ea4-bb92-c9542c0de314","Type":"ContainerStarted","Data":"f89d3a93f975d9999f94a6a09b1679510d95c693e1c53d8cb5ab8350016cba96"} Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.987079 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="56aaaa63-fc9e-4be4-b59e-94a58c3d871b" containerID="d55e11f8bf69f614d485584fc82152577855c4a5c9e99880a53a8e1e7b21b863" exitCode=0 Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.987120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5f65" event={"ID":"56aaaa63-fc9e-4be4-b59e-94a58c3d871b","Type":"ContainerDied","Data":"d55e11f8bf69f614d485584fc82152577855c4a5c9e99880a53a8e1e7b21b863"} Dec 04 01:11:08 crc kubenswrapper[4764]: I1204 01:11:08.987153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5f65" event={"ID":"56aaaa63-fc9e-4be4-b59e-94a58c3d871b","Type":"ContainerStarted","Data":"8349d06fb6f4c9fd5434686df817833e5bc5c36ab105c852e8da2169669ded1a"} Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.432098 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.437632 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.583436 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700c9327-890d-4ea4-bb92-c9542c0de314-operator-scripts\") pod \"700c9327-890d-4ea4-bb92-c9542c0de314\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.583533 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsm8c\" (UniqueName: \"kubernetes.io/projected/700c9327-890d-4ea4-bb92-c9542c0de314-kube-api-access-qsm8c\") pod \"700c9327-890d-4ea4-bb92-c9542c0de314\" (UID: \"700c9327-890d-4ea4-bb92-c9542c0de314\") " Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.583799 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85q9w\" (UniqueName: \"kubernetes.io/projected/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-kube-api-access-85q9w\") pod \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.583863 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-operator-scripts\") pod \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\" (UID: \"56aaaa63-fc9e-4be4-b59e-94a58c3d871b\") " Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.584275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/700c9327-890d-4ea4-bb92-c9542c0de314-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "700c9327-890d-4ea4-bb92-c9542c0de314" (UID: "700c9327-890d-4ea4-bb92-c9542c0de314"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.584393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56aaaa63-fc9e-4be4-b59e-94a58c3d871b" (UID: "56aaaa63-fc9e-4be4-b59e-94a58c3d871b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.584680 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.584707 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700c9327-890d-4ea4-bb92-c9542c0de314-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.593966 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-kube-api-access-85q9w" (OuterVolumeSpecName: "kube-api-access-85q9w") pod "56aaaa63-fc9e-4be4-b59e-94a58c3d871b" (UID: "56aaaa63-fc9e-4be4-b59e-94a58c3d871b"). InnerVolumeSpecName "kube-api-access-85q9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.594017 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700c9327-890d-4ea4-bb92-c9542c0de314-kube-api-access-qsm8c" (OuterVolumeSpecName: "kube-api-access-qsm8c") pod "700c9327-890d-4ea4-bb92-c9542c0de314" (UID: "700c9327-890d-4ea4-bb92-c9542c0de314"). InnerVolumeSpecName "kube-api-access-qsm8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.686582 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85q9w\" (UniqueName: \"kubernetes.io/projected/56aaaa63-fc9e-4be4-b59e-94a58c3d871b-kube-api-access-85q9w\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:10 crc kubenswrapper[4764]: I1204 01:11:10.686621 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsm8c\" (UniqueName: \"kubernetes.io/projected/700c9327-890d-4ea4-bb92-c9542c0de314-kube-api-access-qsm8c\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:11 crc kubenswrapper[4764]: I1204 01:11:11.008617 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ae2c-account-create-update-8zmdj" Dec 04 01:11:11 crc kubenswrapper[4764]: I1204 01:11:11.008644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ae2c-account-create-update-8zmdj" event={"ID":"700c9327-890d-4ea4-bb92-c9542c0de314","Type":"ContainerDied","Data":"f89d3a93f975d9999f94a6a09b1679510d95c693e1c53d8cb5ab8350016cba96"} Dec 04 01:11:11 crc kubenswrapper[4764]: I1204 01:11:11.008757 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f89d3a93f975d9999f94a6a09b1679510d95c693e1c53d8cb5ab8350016cba96" Dec 04 01:11:11 crc kubenswrapper[4764]: I1204 01:11:11.011363 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f5f65" event={"ID":"56aaaa63-fc9e-4be4-b59e-94a58c3d871b","Type":"ContainerDied","Data":"8349d06fb6f4c9fd5434686df817833e5bc5c36ab105c852e8da2169669ded1a"} Dec 04 01:11:11 crc kubenswrapper[4764]: I1204 01:11:11.011401 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8349d06fb6f4c9fd5434686df817833e5bc5c36ab105c852e8da2169669ded1a" Dec 04 01:11:11 crc kubenswrapper[4764]: I1204 01:11:11.011466 4764 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-db-create-f5f65" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.545908 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:11:12 crc kubenswrapper[4764]: E1204 01:11:12.546209 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.614632 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-5gq6j"] Dec 04 01:11:12 crc kubenswrapper[4764]: E1204 01:11:12.614945 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700c9327-890d-4ea4-bb92-c9542c0de314" containerName="mariadb-account-create-update" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.614962 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="700c9327-890d-4ea4-bb92-c9542c0de314" containerName="mariadb-account-create-update" Dec 04 01:11:12 crc kubenswrapper[4764]: E1204 01:11:12.614988 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56aaaa63-fc9e-4be4-b59e-94a58c3d871b" containerName="mariadb-database-create" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.614994 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="56aaaa63-fc9e-4be4-b59e-94a58c3d871b" containerName="mariadb-database-create" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.615141 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="56aaaa63-fc9e-4be4-b59e-94a58c3d871b" containerName="mariadb-database-create" Dec 04 01:11:12 crc kubenswrapper[4764]: 
I1204 01:11:12.615157 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="700c9327-890d-4ea4-bb92-c9542c0de314" containerName="mariadb-account-create-update" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.615657 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.634929 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5gq6j"] Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.635334 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.635555 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xslx8" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.721566 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-db-sync-config-data\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.721818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-combined-ca-bundle\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.721881 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/44374b57-55ca-4eb6-acd9-34eea12e4f86-kube-api-access-tgbvp\") pod \"barbican-db-sync-5gq6j\" (UID: 
\"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.823170 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/44374b57-55ca-4eb6-acd9-34eea12e4f86-kube-api-access-tgbvp\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.823296 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-db-sync-config-data\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.823392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-combined-ca-bundle\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.828667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-combined-ca-bundle\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.844797 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-db-sync-config-data\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 
01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.854450 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/44374b57-55ca-4eb6-acd9-34eea12e4f86-kube-api-access-tgbvp\") pod \"barbican-db-sync-5gq6j\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:12 crc kubenswrapper[4764]: I1204 01:11:12.939760 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:13 crc kubenswrapper[4764]: I1204 01:11:13.500231 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-5gq6j"] Dec 04 01:11:13 crc kubenswrapper[4764]: W1204 01:11:13.507230 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44374b57_55ca_4eb6_acd9_34eea12e4f86.slice/crio-662b43c9a202c8941edfeaa788c0eb1bfaa6c5314e9cbf868756ca06de1ff9b7 WatchSource:0}: Error finding container 662b43c9a202c8941edfeaa788c0eb1bfaa6c5314e9cbf868756ca06de1ff9b7: Status 404 returned error can't find the container with id 662b43c9a202c8941edfeaa788c0eb1bfaa6c5314e9cbf868756ca06de1ff9b7 Dec 04 01:11:14 crc kubenswrapper[4764]: I1204 01:11:14.042636 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5gq6j" event={"ID":"44374b57-55ca-4eb6-acd9-34eea12e4f86","Type":"ContainerStarted","Data":"23107788f3ccdd79607654beba94204f0532168a837c22039178059e2326a4f2"} Dec 04 01:11:14 crc kubenswrapper[4764]: I1204 01:11:14.042712 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5gq6j" event={"ID":"44374b57-55ca-4eb6-acd9-34eea12e4f86","Type":"ContainerStarted","Data":"662b43c9a202c8941edfeaa788c0eb1bfaa6c5314e9cbf868756ca06de1ff9b7"} Dec 04 01:11:14 crc kubenswrapper[4764]: I1204 01:11:14.068933 4764 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/barbican-db-sync-5gq6j" podStartSLOduration=2.068917477 podStartE2EDuration="2.068917477s" podCreationTimestamp="2025-12-04 01:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:14.0605288 +0000 UTC m=+5409.821853231" watchObservedRunningTime="2025-12-04 01:11:14.068917477 +0000 UTC m=+5409.830241888" Dec 04 01:11:15 crc kubenswrapper[4764]: I1204 01:11:15.058892 4764 generic.go:334] "Generic (PLEG): container finished" podID="44374b57-55ca-4eb6-acd9-34eea12e4f86" containerID="23107788f3ccdd79607654beba94204f0532168a837c22039178059e2326a4f2" exitCode=0 Dec 04 01:11:15 crc kubenswrapper[4764]: I1204 01:11:15.058988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5gq6j" event={"ID":"44374b57-55ca-4eb6-acd9-34eea12e4f86","Type":"ContainerDied","Data":"23107788f3ccdd79607654beba94204f0532168a837c22039178059e2326a4f2"} Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.406431 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.429171 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/44374b57-55ca-4eb6-acd9-34eea12e4f86-kube-api-access-tgbvp\") pod \"44374b57-55ca-4eb6-acd9-34eea12e4f86\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.429264 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-db-sync-config-data\") pod \"44374b57-55ca-4eb6-acd9-34eea12e4f86\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.429662 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-combined-ca-bundle\") pod \"44374b57-55ca-4eb6-acd9-34eea12e4f86\" (UID: \"44374b57-55ca-4eb6-acd9-34eea12e4f86\") " Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.435044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44374b57-55ca-4eb6-acd9-34eea12e4f86-kube-api-access-tgbvp" (OuterVolumeSpecName: "kube-api-access-tgbvp") pod "44374b57-55ca-4eb6-acd9-34eea12e4f86" (UID: "44374b57-55ca-4eb6-acd9-34eea12e4f86"). InnerVolumeSpecName "kube-api-access-tgbvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.435906 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "44374b57-55ca-4eb6-acd9-34eea12e4f86" (UID: "44374b57-55ca-4eb6-acd9-34eea12e4f86"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.452953 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44374b57-55ca-4eb6-acd9-34eea12e4f86" (UID: "44374b57-55ca-4eb6-acd9-34eea12e4f86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.532656 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.532701 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbvp\" (UniqueName: \"kubernetes.io/projected/44374b57-55ca-4eb6-acd9-34eea12e4f86-kube-api-access-tgbvp\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:16 crc kubenswrapper[4764]: I1204 01:11:16.532748 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44374b57-55ca-4eb6-acd9-34eea12e4f86-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.080179 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-5gq6j" event={"ID":"44374b57-55ca-4eb6-acd9-34eea12e4f86","Type":"ContainerDied","Data":"662b43c9a202c8941edfeaa788c0eb1bfaa6c5314e9cbf868756ca06de1ff9b7"} Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.080237 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="662b43c9a202c8941edfeaa788c0eb1bfaa6c5314e9cbf868756ca06de1ff9b7" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.080319 4764 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-5gq6j" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.351130 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-686884878f-7crbh"] Dec 04 01:11:17 crc kubenswrapper[4764]: E1204 01:11:17.351979 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44374b57-55ca-4eb6-acd9-34eea12e4f86" containerName="barbican-db-sync" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.352002 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="44374b57-55ca-4eb6-acd9-34eea12e4f86" containerName="barbican-db-sync" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.352320 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="44374b57-55ca-4eb6-acd9-34eea12e4f86" containerName="barbican-db-sync" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.353472 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.367510 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.368569 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-xslx8" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.368806 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.385173 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-686884878f-7crbh"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.405140 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d646cdddb-jwkg4"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.406670 4764 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.409041 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449560 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-logs\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449599 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-config-data-custom\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449623 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-config-data\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449672 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-config-data\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: 
I1204 01:11:17.449695 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-config-data-custom\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449727 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-combined-ca-bundle\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449761 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzlzg\" (UniqueName: \"kubernetes.io/projected/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-kube-api-access-pzlzg\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d646cdddb-jwkg4"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449792 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/623ac352-8e8d-4baa-b526-01cdaa99302b-logs\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.449978 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-combined-ca-bundle\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.450014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxkb8\" (UniqueName: \"kubernetes.io/projected/623ac352-8e8d-4baa-b526-01cdaa99302b-kube-api-access-wxkb8\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.470781 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f76578655-fgp47"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.472675 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.478502 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f76578655-fgp47"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.543930 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57d69cf6fb-qrpxb"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.545193 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.547677 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.550954 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-dns-svc\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.550989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-combined-ca-bundle\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxkb8\" (UniqueName: \"kubernetes.io/projected/623ac352-8e8d-4baa-b526-01cdaa99302b-kube-api-access-wxkb8\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551048 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-logs\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-config-data-custom\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-config\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-config-data\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-config-data\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551166 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-config-data-custom\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551185 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-combined-ca-bundle\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551215 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzlzg\" (UniqueName: \"kubernetes.io/projected/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-kube-api-access-pzlzg\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551235 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7gn\" (UniqueName: \"kubernetes.io/projected/c4eafd88-870a-463d-9138-01c26058cc32-kube-api-access-7d7gn\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-nb\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.551271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/623ac352-8e8d-4baa-b526-01cdaa99302b-logs\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 
01:11:17.551302 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-sb\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.553657 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-logs\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.554518 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/623ac352-8e8d-4baa-b526-01cdaa99302b-logs\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.557549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-config-data\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.559834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-combined-ca-bundle\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.576107 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-config-data-custom\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.579024 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzlzg\" (UniqueName: \"kubernetes.io/projected/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-kube-api-access-pzlzg\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.585204 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57d69cf6fb-qrpxb"] Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.587634 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/623ac352-8e8d-4baa-b526-01cdaa99302b-config-data\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.591121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-combined-ca-bundle\") pod \"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.594005 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2-config-data-custom\") pod 
\"barbican-keystone-listener-7d646cdddb-jwkg4\" (UID: \"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2\") " pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.596603 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxkb8\" (UniqueName: \"kubernetes.io/projected/623ac352-8e8d-4baa-b526-01cdaa99302b-kube-api-access-wxkb8\") pod \"barbican-worker-686884878f-7crbh\" (UID: \"623ac352-8e8d-4baa-b526-01cdaa99302b\") " pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwjw\" (UniqueName: \"kubernetes.io/projected/c79a42ff-9710-4e85-9572-b5ef52f182c9-kube-api-access-jnwjw\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-combined-ca-bundle\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656292 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7gn\" (UniqueName: \"kubernetes.io/projected/c4eafd88-870a-463d-9138-01c26058cc32-kube-api-access-7d7gn\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-config-data\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-nb\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656453 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-sb\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-dns-svc\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-config-data-custom\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.656960 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-config\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.657030 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c79a42ff-9710-4e85-9572-b5ef52f182c9-logs\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.658021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-dns-svc\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.658221 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-config\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.659322 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-nb\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.660324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-sb\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: 
\"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.673706 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7gn\" (UniqueName: \"kubernetes.io/projected/c4eafd88-870a-463d-9138-01c26058cc32-kube-api-access-7d7gn\") pod \"dnsmasq-dns-5f76578655-fgp47\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.703005 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-686884878f-7crbh" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.734357 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.758408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c79a42ff-9710-4e85-9572-b5ef52f182c9-logs\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.758462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwjw\" (UniqueName: \"kubernetes.io/projected/c79a42ff-9710-4e85-9572-b5ef52f182c9-kube-api-access-jnwjw\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.758767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c79a42ff-9710-4e85-9572-b5ef52f182c9-logs\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " 
pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.758817 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-combined-ca-bundle\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.758848 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-config-data\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.758905 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-config-data-custom\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.762759 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-config-data-custom\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.763203 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-combined-ca-bundle\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 
04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.765029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c79a42ff-9710-4e85-9572-b5ef52f182c9-config-data\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.776400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwjw\" (UniqueName: \"kubernetes.io/projected/c79a42ff-9710-4e85-9572-b5ef52f182c9-kube-api-access-jnwjw\") pod \"barbican-api-57d69cf6fb-qrpxb\" (UID: \"c79a42ff-9710-4e85-9572-b5ef52f182c9\") " pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.801591 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:17 crc kubenswrapper[4764]: I1204 01:11:17.964862 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:18 crc kubenswrapper[4764]: I1204 01:11:18.189461 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-686884878f-7crbh"] Dec 04 01:11:18 crc kubenswrapper[4764]: W1204 01:11:18.190499 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod623ac352_8e8d_4baa_b526_01cdaa99302b.slice/crio-77e5b431ece50f513badf33211135a5895c2c79f247f0a6630ee9ea00b24451d WatchSource:0}: Error finding container 77e5b431ece50f513badf33211135a5895c2c79f247f0a6630ee9ea00b24451d: Status 404 returned error can't find the container with id 77e5b431ece50f513badf33211135a5895c2c79f247f0a6630ee9ea00b24451d Dec 04 01:11:18 crc kubenswrapper[4764]: I1204 01:11:18.275865 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d646cdddb-jwkg4"] Dec 04 01:11:18 crc kubenswrapper[4764]: W1204 01:11:18.286297 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6684ffcd_bdd3_4b3f_98fd_1f2adf292cf2.slice/crio-c71841bc1f88f5e5dde859f2a4cc74966563f7eb2dfed4d919f1588fe0183a50 WatchSource:0}: Error finding container c71841bc1f88f5e5dde859f2a4cc74966563f7eb2dfed4d919f1588fe0183a50: Status 404 returned error can't find the container with id c71841bc1f88f5e5dde859f2a4cc74966563f7eb2dfed4d919f1588fe0183a50 Dec 04 01:11:18 crc kubenswrapper[4764]: W1204 01:11:18.354855 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4eafd88_870a_463d_9138_01c26058cc32.slice/crio-db8b31fc7f1b4a1220aad12611b45c8ab8b4c257a6cc546c9250fa90b9c184f6 WatchSource:0}: Error finding container db8b31fc7f1b4a1220aad12611b45c8ab8b4c257a6cc546c9250fa90b9c184f6: Status 404 returned error can't find the container with id 
db8b31fc7f1b4a1220aad12611b45c8ab8b4c257a6cc546c9250fa90b9c184f6 Dec 04 01:11:18 crc kubenswrapper[4764]: I1204 01:11:18.359195 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f76578655-fgp47"] Dec 04 01:11:18 crc kubenswrapper[4764]: I1204 01:11:18.439281 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57d69cf6fb-qrpxb"] Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.104289 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d69cf6fb-qrpxb" event={"ID":"c79a42ff-9710-4e85-9572-b5ef52f182c9","Type":"ContainerStarted","Data":"4d44f7e7a4a7de0280c0cccc285d67a79c6f7c2794f4301b9c5b80e35f020d8a"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.104877 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.105048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d69cf6fb-qrpxb" event={"ID":"c79a42ff-9710-4e85-9572-b5ef52f182c9","Type":"ContainerStarted","Data":"c558a276f7eb0daa390c7a9fcbf476284cd4db31630585ea2dbfd0702bf4d02d"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.105122 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57d69cf6fb-qrpxb" event={"ID":"c79a42ff-9710-4e85-9572-b5ef52f182c9","Type":"ContainerStarted","Data":"dc7d92a60abe212d3151ad0fe658fe5eb400e52f79662131ed3d3eba1cf1f3c2"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.105190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.106700 4764 generic.go:334] "Generic (PLEG): container finished" podID="c4eafd88-870a-463d-9138-01c26058cc32" containerID="391526f996b57838bdc9ff852aa537feedc02931bf5a91b341b965e68aca617a" exitCode=0 Dec 04 01:11:19 crc kubenswrapper[4764]: 
I1204 01:11:19.106899 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f76578655-fgp47" event={"ID":"c4eafd88-870a-463d-9138-01c26058cc32","Type":"ContainerDied","Data":"391526f996b57838bdc9ff852aa537feedc02931bf5a91b341b965e68aca617a"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.106941 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f76578655-fgp47" event={"ID":"c4eafd88-870a-463d-9138-01c26058cc32","Type":"ContainerStarted","Data":"db8b31fc7f1b4a1220aad12611b45c8ab8b4c257a6cc546c9250fa90b9c184f6"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.109575 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" event={"ID":"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2","Type":"ContainerStarted","Data":"a898a7abd64e94fe8c9ee5c9adc934b9010a80a33c6a86f242c0ab0d713384a5"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.109618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" event={"ID":"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2","Type":"ContainerStarted","Data":"0303377c8280e1880d1abd9f2b3260ed9b636626e6fdf726371952a00304cca7"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.109633 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" event={"ID":"6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2","Type":"ContainerStarted","Data":"c71841bc1f88f5e5dde859f2a4cc74966563f7eb2dfed4d919f1588fe0183a50"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.112258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-686884878f-7crbh" event={"ID":"623ac352-8e8d-4baa-b526-01cdaa99302b","Type":"ContainerStarted","Data":"b9961b93bd2c2dbce94dfa2e8117119857ce735d2debcbc14f55a7bf2948b47f"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.112295 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/barbican-worker-686884878f-7crbh" event={"ID":"623ac352-8e8d-4baa-b526-01cdaa99302b","Type":"ContainerStarted","Data":"bd925e702c1ba08ab3c8f3e9c2ac5e9ac08103e113ca89f634281b18bfae53df"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.112307 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-686884878f-7crbh" event={"ID":"623ac352-8e8d-4baa-b526-01cdaa99302b","Type":"ContainerStarted","Data":"77e5b431ece50f513badf33211135a5895c2c79f247f0a6630ee9ea00b24451d"} Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.128300 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57d69cf6fb-qrpxb" podStartSLOduration=2.128281373 podStartE2EDuration="2.128281373s" podCreationTimestamp="2025-12-04 01:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:19.124136381 +0000 UTC m=+5414.885460792" watchObservedRunningTime="2025-12-04 01:11:19.128281373 +0000 UTC m=+5414.889605784" Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.175214 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d646cdddb-jwkg4" podStartSLOduration=2.175196859 podStartE2EDuration="2.175196859s" podCreationTimestamp="2025-12-04 01:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:19.162103977 +0000 UTC m=+5414.923428388" watchObservedRunningTime="2025-12-04 01:11:19.175196859 +0000 UTC m=+5414.936521270" Dec 04 01:11:19 crc kubenswrapper[4764]: I1204 01:11:19.186629 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-686884878f-7crbh" podStartSLOduration=2.1866077 podStartE2EDuration="2.1866077s" podCreationTimestamp="2025-12-04 01:11:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:19.182258333 +0000 UTC m=+5414.943582744" watchObservedRunningTime="2025-12-04 01:11:19.1866077 +0000 UTC m=+5414.947932111" Dec 04 01:11:20 crc kubenswrapper[4764]: I1204 01:11:20.124618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f76578655-fgp47" event={"ID":"c4eafd88-870a-463d-9138-01c26058cc32","Type":"ContainerStarted","Data":"d07bcc7f5d09efdcce95c3ac207c5406706f295647eedbe6721c166617cee1e9"} Dec 04 01:11:20 crc kubenswrapper[4764]: I1204 01:11:20.126327 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:20 crc kubenswrapper[4764]: I1204 01:11:20.162657 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f76578655-fgp47" podStartSLOduration=3.162638827 podStartE2EDuration="3.162638827s" podCreationTimestamp="2025-12-04 01:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:20.154084087 +0000 UTC m=+5415.915408538" watchObservedRunningTime="2025-12-04 01:11:20.162638827 +0000 UTC m=+5415.923963248" Dec 04 01:11:24 crc kubenswrapper[4764]: I1204 01:11:24.556659 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:11:24 crc kubenswrapper[4764]: E1204 01:11:24.557821 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:11:27 crc kubenswrapper[4764]: I1204 01:11:27.803982 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:11:27 crc kubenswrapper[4764]: I1204 01:11:27.886238 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb8758895-5pcsh"] Dec 04 01:11:27 crc kubenswrapper[4764]: I1204 01:11:27.886965 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerName="dnsmasq-dns" containerID="cri-o://6f22abcb9c2914f432fb55c45c2999cb1f1716a519fe60db84fa4d49f779a659" gracePeriod=10 Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.205215 4764 generic.go:334] "Generic (PLEG): container finished" podID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerID="6f22abcb9c2914f432fb55c45c2999cb1f1716a519fe60db84fa4d49f779a659" exitCode=0 Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.205306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" event={"ID":"d178584a-4dc5-4161-8bd0-c55440c48f22","Type":"ContainerDied","Data":"6f22abcb9c2914f432fb55c45c2999cb1f1716a519fe60db84fa4d49f779a659"} Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.370990 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.492881 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzp67\" (UniqueName: \"kubernetes.io/projected/d178584a-4dc5-4161-8bd0-c55440c48f22-kube-api-access-wzp67\") pod \"d178584a-4dc5-4161-8bd0-c55440c48f22\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.492938 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-sb\") pod \"d178584a-4dc5-4161-8bd0-c55440c48f22\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.493011 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-config\") pod \"d178584a-4dc5-4161-8bd0-c55440c48f22\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.493045 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-dns-svc\") pod \"d178584a-4dc5-4161-8bd0-c55440c48f22\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.493187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-nb\") pod \"d178584a-4dc5-4161-8bd0-c55440c48f22\" (UID: \"d178584a-4dc5-4161-8bd0-c55440c48f22\") " Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.510447 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d178584a-4dc5-4161-8bd0-c55440c48f22-kube-api-access-wzp67" (OuterVolumeSpecName: "kube-api-access-wzp67") pod "d178584a-4dc5-4161-8bd0-c55440c48f22" (UID: "d178584a-4dc5-4161-8bd0-c55440c48f22"). InnerVolumeSpecName "kube-api-access-wzp67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.531921 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d178584a-4dc5-4161-8bd0-c55440c48f22" (UID: "d178584a-4dc5-4161-8bd0-c55440c48f22"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.538282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d178584a-4dc5-4161-8bd0-c55440c48f22" (UID: "d178584a-4dc5-4161-8bd0-c55440c48f22"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.563259 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d178584a-4dc5-4161-8bd0-c55440c48f22" (UID: "d178584a-4dc5-4161-8bd0-c55440c48f22"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.563513 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-config" (OuterVolumeSpecName: "config") pod "d178584a-4dc5-4161-8bd0-c55440c48f22" (UID: "d178584a-4dc5-4161-8bd0-c55440c48f22"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.594665 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzp67\" (UniqueName: \"kubernetes.io/projected/d178584a-4dc5-4161-8bd0-c55440c48f22-kube-api-access-wzp67\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.594695 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.594709 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.594734 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:28 crc kubenswrapper[4764]: I1204 01:11:28.594744 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d178584a-4dc5-4161-8bd0-c55440c48f22-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.218332 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" event={"ID":"d178584a-4dc5-4161-8bd0-c55440c48f22","Type":"ContainerDied","Data":"6b2dfa295b387f9816f710c2f5663af394a9459238020d1d376fb4fc75f78d37"} Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.218452 4764 scope.go:117] "RemoveContainer" containerID="6f22abcb9c2914f432fb55c45c2999cb1f1716a519fe60db84fa4d49f779a659" Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.218466 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb8758895-5pcsh" Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.243644 4764 scope.go:117] "RemoveContainer" containerID="8b5214240777222dca4fc6835fb0405b4652f16ed9c3c611628e44553bfb098c" Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.281029 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb8758895-5pcsh"] Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.291648 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb8758895-5pcsh"] Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.379148 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:29 crc kubenswrapper[4764]: I1204 01:11:29.401910 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57d69cf6fb-qrpxb" Dec 04 01:11:30 crc kubenswrapper[4764]: I1204 01:11:30.565540 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" path="/var/lib/kubelet/pods/d178584a-4dc5-4161-8bd0-c55440c48f22/volumes" Dec 04 01:11:39 crc kubenswrapper[4764]: I1204 01:11:39.546281 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:11:39 crc kubenswrapper[4764]: E1204 01:11:39.547477 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.289029 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-t6zlw"] Dec 04 01:11:41 crc kubenswrapper[4764]: E1204 01:11:41.289761 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerName="init" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.289779 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerName="init" Dec 04 01:11:41 crc kubenswrapper[4764]: E1204 01:11:41.289804 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerName="dnsmasq-dns" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.289810 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerName="dnsmasq-dns" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.289954 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d178584a-4dc5-4161-8bd0-c55440c48f22" containerName="dnsmasq-dns" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.290484 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.306283 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t6zlw"] Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.349791 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8422e89a-26b9-4543-b999-720bf0cf7224-operator-scripts\") pod \"neutron-db-create-t6zlw\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.349879 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjqm6\" (UniqueName: \"kubernetes.io/projected/8422e89a-26b9-4543-b999-720bf0cf7224-kube-api-access-pjqm6\") pod \"neutron-db-create-t6zlw\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.389741 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5469-account-create-update-9w7d2"] Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.390874 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.393443 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.406369 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5469-account-create-update-9w7d2"] Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.451926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8422e89a-26b9-4543-b999-720bf0cf7224-operator-scripts\") pod \"neutron-db-create-t6zlw\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.451985 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjqm6\" (UniqueName: \"kubernetes.io/projected/8422e89a-26b9-4543-b999-720bf0cf7224-kube-api-access-pjqm6\") pod \"neutron-db-create-t6zlw\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.452063 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczwm\" (UniqueName: \"kubernetes.io/projected/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-kube-api-access-bczwm\") pod \"neutron-5469-account-create-update-9w7d2\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.452126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-operator-scripts\") pod \"neutron-5469-account-create-update-9w7d2\" (UID: 
\"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.452790 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8422e89a-26b9-4543-b999-720bf0cf7224-operator-scripts\") pod \"neutron-db-create-t6zlw\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.469707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjqm6\" (UniqueName: \"kubernetes.io/projected/8422e89a-26b9-4543-b999-720bf0cf7224-kube-api-access-pjqm6\") pod \"neutron-db-create-t6zlw\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.553846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bczwm\" (UniqueName: \"kubernetes.io/projected/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-kube-api-access-bczwm\") pod \"neutron-5469-account-create-update-9w7d2\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.554212 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-operator-scripts\") pod \"neutron-5469-account-create-update-9w7d2\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.554853 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-operator-scripts\") pod 
\"neutron-5469-account-create-update-9w7d2\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.571489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczwm\" (UniqueName: \"kubernetes.io/projected/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-kube-api-access-bczwm\") pod \"neutron-5469-account-create-update-9w7d2\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.606997 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:41 crc kubenswrapper[4764]: I1204 01:11:41.708451 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:42 crc kubenswrapper[4764]: I1204 01:11:42.080409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-t6zlw"] Dec 04 01:11:42 crc kubenswrapper[4764]: W1204 01:11:42.081266 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8422e89a_26b9_4543_b999_720bf0cf7224.slice/crio-90622c0067e1f93e3def11d0324ac49fdaa72029c422202d1a4d1ea5236d76a6 WatchSource:0}: Error finding container 90622c0067e1f93e3def11d0324ac49fdaa72029c422202d1a4d1ea5236d76a6: Status 404 returned error can't find the container with id 90622c0067e1f93e3def11d0324ac49fdaa72029c422202d1a4d1ea5236d76a6 Dec 04 01:11:42 crc kubenswrapper[4764]: I1204 01:11:42.192644 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5469-account-create-update-9w7d2"] Dec 04 01:11:42 crc kubenswrapper[4764]: W1204 01:11:42.198539 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod530b9e0b_90f2_4b27_9109_d59dcdfd8c71.slice/crio-14b115f1a33284ce15b3e79448c042fbec83bf22be3cf2914fce8bed749e533e WatchSource:0}: Error finding container 14b115f1a33284ce15b3e79448c042fbec83bf22be3cf2914fce8bed749e533e: Status 404 returned error can't find the container with id 14b115f1a33284ce15b3e79448c042fbec83bf22be3cf2914fce8bed749e533e Dec 04 01:11:42 crc kubenswrapper[4764]: I1204 01:11:42.346918 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6zlw" event={"ID":"8422e89a-26b9-4543-b999-720bf0cf7224","Type":"ContainerStarted","Data":"85581f794732e9d01529d6ac7396d77e57128113f5fe3cacd69c89ccc8387790"} Dec 04 01:11:42 crc kubenswrapper[4764]: I1204 01:11:42.346969 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6zlw" event={"ID":"8422e89a-26b9-4543-b999-720bf0cf7224","Type":"ContainerStarted","Data":"90622c0067e1f93e3def11d0324ac49fdaa72029c422202d1a4d1ea5236d76a6"} Dec 04 01:11:42 crc kubenswrapper[4764]: I1204 01:11:42.351149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5469-account-create-update-9w7d2" event={"ID":"530b9e0b-90f2-4b27-9109-d59dcdfd8c71","Type":"ContainerStarted","Data":"14b115f1a33284ce15b3e79448c042fbec83bf22be3cf2914fce8bed749e533e"} Dec 04 01:11:42 crc kubenswrapper[4764]: I1204 01:11:42.367330 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-t6zlw" podStartSLOduration=1.3673120060000001 podStartE2EDuration="1.367312006s" podCreationTimestamp="2025-12-04 01:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:42.366128907 +0000 UTC m=+5438.127453328" watchObservedRunningTime="2025-12-04 01:11:42.367312006 +0000 UTC m=+5438.128636417" Dec 04 01:11:43 crc kubenswrapper[4764]: I1204 01:11:43.364932 4764 
generic.go:334] "Generic (PLEG): container finished" podID="8422e89a-26b9-4543-b999-720bf0cf7224" containerID="85581f794732e9d01529d6ac7396d77e57128113f5fe3cacd69c89ccc8387790" exitCode=0 Dec 04 01:11:43 crc kubenswrapper[4764]: I1204 01:11:43.365208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6zlw" event={"ID":"8422e89a-26b9-4543-b999-720bf0cf7224","Type":"ContainerDied","Data":"85581f794732e9d01529d6ac7396d77e57128113f5fe3cacd69c89ccc8387790"} Dec 04 01:11:43 crc kubenswrapper[4764]: I1204 01:11:43.368298 4764 generic.go:334] "Generic (PLEG): container finished" podID="530b9e0b-90f2-4b27-9109-d59dcdfd8c71" containerID="d9b7406e5ae6b6e6827d3e42930f6d230ba4dff071daed3c932b57616bc01164" exitCode=0 Dec 04 01:11:43 crc kubenswrapper[4764]: I1204 01:11:43.368362 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5469-account-create-update-9w7d2" event={"ID":"530b9e0b-90f2-4b27-9109-d59dcdfd8c71","Type":"ContainerDied","Data":"d9b7406e5ae6b6e6827d3e42930f6d230ba4dff071daed3c932b57616bc01164"} Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.846076 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.857875 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.933838 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bczwm\" (UniqueName: \"kubernetes.io/projected/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-kube-api-access-bczwm\") pod \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.934024 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8422e89a-26b9-4543-b999-720bf0cf7224-operator-scripts\") pod \"8422e89a-26b9-4543-b999-720bf0cf7224\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.934048 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-operator-scripts\") pod \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\" (UID: \"530b9e0b-90f2-4b27-9109-d59dcdfd8c71\") " Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.934103 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjqm6\" (UniqueName: \"kubernetes.io/projected/8422e89a-26b9-4543-b999-720bf0cf7224-kube-api-access-pjqm6\") pod \"8422e89a-26b9-4543-b999-720bf0cf7224\" (UID: \"8422e89a-26b9-4543-b999-720bf0cf7224\") " Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.934961 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8422e89a-26b9-4543-b999-720bf0cf7224-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8422e89a-26b9-4543-b999-720bf0cf7224" (UID: "8422e89a-26b9-4543-b999-720bf0cf7224"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.935026 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "530b9e0b-90f2-4b27-9109-d59dcdfd8c71" (UID: "530b9e0b-90f2-4b27-9109-d59dcdfd8c71"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.935378 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8422e89a-26b9-4543-b999-720bf0cf7224-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.935448 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.939653 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8422e89a-26b9-4543-b999-720bf0cf7224-kube-api-access-pjqm6" (OuterVolumeSpecName: "kube-api-access-pjqm6") pod "8422e89a-26b9-4543-b999-720bf0cf7224" (UID: "8422e89a-26b9-4543-b999-720bf0cf7224"). InnerVolumeSpecName "kube-api-access-pjqm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:44 crc kubenswrapper[4764]: I1204 01:11:44.940030 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-kube-api-access-bczwm" (OuterVolumeSpecName: "kube-api-access-bczwm") pod "530b9e0b-90f2-4b27-9109-d59dcdfd8c71" (UID: "530b9e0b-90f2-4b27-9109-d59dcdfd8c71"). InnerVolumeSpecName "kube-api-access-bczwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.037401 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjqm6\" (UniqueName: \"kubernetes.io/projected/8422e89a-26b9-4543-b999-720bf0cf7224-kube-api-access-pjqm6\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.037457 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bczwm\" (UniqueName: \"kubernetes.io/projected/530b9e0b-90f2-4b27-9109-d59dcdfd8c71-kube-api-access-bczwm\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.389822 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-t6zlw" Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.390016 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-t6zlw" event={"ID":"8422e89a-26b9-4543-b999-720bf0cf7224","Type":"ContainerDied","Data":"90622c0067e1f93e3def11d0324ac49fdaa72029c422202d1a4d1ea5236d76a6"} Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.390056 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90622c0067e1f93e3def11d0324ac49fdaa72029c422202d1a4d1ea5236d76a6" Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.402201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5469-account-create-update-9w7d2" event={"ID":"530b9e0b-90f2-4b27-9109-d59dcdfd8c71","Type":"ContainerDied","Data":"14b115f1a33284ce15b3e79448c042fbec83bf22be3cf2914fce8bed749e533e"} Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.402269 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14b115f1a33284ce15b3e79448c042fbec83bf22be3cf2914fce8bed749e533e" Dec 04 01:11:45 crc kubenswrapper[4764]: I1204 01:11:45.402290 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5469-account-create-update-9w7d2" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.663612 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fvv6g"] Dec 04 01:11:46 crc kubenswrapper[4764]: E1204 01:11:46.664280 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8422e89a-26b9-4543-b999-720bf0cf7224" containerName="mariadb-database-create" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.664298 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8422e89a-26b9-4543-b999-720bf0cf7224" containerName="mariadb-database-create" Dec 04 01:11:46 crc kubenswrapper[4764]: E1204 01:11:46.664313 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="530b9e0b-90f2-4b27-9109-d59dcdfd8c71" containerName="mariadb-account-create-update" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.664321 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="530b9e0b-90f2-4b27-9109-d59dcdfd8c71" containerName="mariadb-account-create-update" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.664607 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8422e89a-26b9-4543-b999-720bf0cf7224" containerName="mariadb-database-create" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.664646 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="530b9e0b-90f2-4b27-9109-d59dcdfd8c71" containerName="mariadb-account-create-update" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.665271 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.667061 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.667070 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tzfbq" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.669695 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.687190 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fvv6g"] Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.775123 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-config\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.775255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-combined-ca-bundle\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.775472 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m2n7\" (UniqueName: \"kubernetes.io/projected/f519b7de-08ef-488a-a725-f7a79ec23e1f-kube-api-access-8m2n7\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.877183 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-combined-ca-bundle\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.877359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m2n7\" (UniqueName: \"kubernetes.io/projected/f519b7de-08ef-488a-a725-f7a79ec23e1f-kube-api-access-8m2n7\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.877482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-config\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.882429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-config\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.883150 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-combined-ca-bundle\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:46 crc kubenswrapper[4764]: I1204 01:11:46.898194 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m2n7\" (UniqueName: 
\"kubernetes.io/projected/f519b7de-08ef-488a-a725-f7a79ec23e1f-kube-api-access-8m2n7\") pod \"neutron-db-sync-fvv6g\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:47 crc kubenswrapper[4764]: I1204 01:11:47.023646 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:47 crc kubenswrapper[4764]: I1204 01:11:47.497251 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fvv6g"] Dec 04 01:11:48 crc kubenswrapper[4764]: I1204 01:11:48.432637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvv6g" event={"ID":"f519b7de-08ef-488a-a725-f7a79ec23e1f","Type":"ContainerStarted","Data":"7990d7e5236a93ad7e3b6d29a09ad4f537aef2909ff04feec0063b9903497573"} Dec 04 01:11:48 crc kubenswrapper[4764]: I1204 01:11:48.433376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvv6g" event={"ID":"f519b7de-08ef-488a-a725-f7a79ec23e1f","Type":"ContainerStarted","Data":"8e73701d5069089f48d5573bed4f9d719ae4693852f63b6265709cd0e3d68952"} Dec 04 01:11:48 crc kubenswrapper[4764]: I1204 01:11:48.466237 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fvv6g" podStartSLOduration=2.466209772 podStartE2EDuration="2.466209772s" podCreationTimestamp="2025-12-04 01:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:48.461206759 +0000 UTC m=+5444.222531180" watchObservedRunningTime="2025-12-04 01:11:48.466209772 +0000 UTC m=+5444.227534213" Dec 04 01:11:52 crc kubenswrapper[4764]: I1204 01:11:52.477260 4764 generic.go:334] "Generic (PLEG): container finished" podID="f519b7de-08ef-488a-a725-f7a79ec23e1f" containerID="7990d7e5236a93ad7e3b6d29a09ad4f537aef2909ff04feec0063b9903497573" exitCode=0 Dec 04 01:11:52 crc 
kubenswrapper[4764]: I1204 01:11:52.477403 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvv6g" event={"ID":"f519b7de-08ef-488a-a725-f7a79ec23e1f","Type":"ContainerDied","Data":"7990d7e5236a93ad7e3b6d29a09ad4f537aef2909ff04feec0063b9903497573"} Dec 04 01:11:53 crc kubenswrapper[4764]: I1204 01:11:53.885843 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.028687 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-combined-ca-bundle\") pod \"f519b7de-08ef-488a-a725-f7a79ec23e1f\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.028843 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-config\") pod \"f519b7de-08ef-488a-a725-f7a79ec23e1f\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.028990 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m2n7\" (UniqueName: \"kubernetes.io/projected/f519b7de-08ef-488a-a725-f7a79ec23e1f-kube-api-access-8m2n7\") pod \"f519b7de-08ef-488a-a725-f7a79ec23e1f\" (UID: \"f519b7de-08ef-488a-a725-f7a79ec23e1f\") " Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.035406 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f519b7de-08ef-488a-a725-f7a79ec23e1f-kube-api-access-8m2n7" (OuterVolumeSpecName: "kube-api-access-8m2n7") pod "f519b7de-08ef-488a-a725-f7a79ec23e1f" (UID: "f519b7de-08ef-488a-a725-f7a79ec23e1f"). InnerVolumeSpecName "kube-api-access-8m2n7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.070574 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-config" (OuterVolumeSpecName: "config") pod "f519b7de-08ef-488a-a725-f7a79ec23e1f" (UID: "f519b7de-08ef-488a-a725-f7a79ec23e1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.071596 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f519b7de-08ef-488a-a725-f7a79ec23e1f" (UID: "f519b7de-08ef-488a-a725-f7a79ec23e1f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.132123 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.132217 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f519b7de-08ef-488a-a725-f7a79ec23e1f-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.132244 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m2n7\" (UniqueName: \"kubernetes.io/projected/f519b7de-08ef-488a-a725-f7a79ec23e1f-kube-api-access-8m2n7\") on node \"crc\" DevicePath \"\"" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.503213 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fvv6g" 
event={"ID":"f519b7de-08ef-488a-a725-f7a79ec23e1f","Type":"ContainerDied","Data":"8e73701d5069089f48d5573bed4f9d719ae4693852f63b6265709cd0e3d68952"} Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.503274 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fvv6g" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.503275 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e73701d5069089f48d5573bed4f9d719ae4693852f63b6265709cd0e3d68952" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.553437 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:11:54 crc kubenswrapper[4764]: E1204 01:11:54.553887 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.778794 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d9ff6bf97-fgklj"] Dec 04 01:11:54 crc kubenswrapper[4764]: E1204 01:11:54.779116 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f519b7de-08ef-488a-a725-f7a79ec23e1f" containerName="neutron-db-sync" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.779130 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f519b7de-08ef-488a-a725-f7a79ec23e1f" containerName="neutron-db-sync" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.779325 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f519b7de-08ef-488a-a725-f7a79ec23e1f" containerName="neutron-db-sync" Dec 04 01:11:54 
crc kubenswrapper[4764]: I1204 01:11:54.780177 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.806274 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9ff6bf97-fgklj"] Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.831518 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54f47ffb59-682nm"] Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.833117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.839952 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.840468 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.841080 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-tzfbq" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.845269 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54f47ffb59-682nm"] Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvqw\" (UniqueName: \"kubernetes.io/projected/978cd232-08ca-459a-91b9-a6c1a27ad58e-kube-api-access-2hvqw\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-httpd-config\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949522 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-config\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-combined-ca-bundle\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949673 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-config\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949786 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx2vm\" (UniqueName: 
\"kubernetes.io/projected/75776f95-41f8-4db4-8a2b-ab60510d54b4-kube-api-access-qx2vm\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949832 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-dns-svc\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:54 crc kubenswrapper[4764]: I1204 01:11:54.949849 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-httpd-config\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051766 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-config\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-combined-ca-bundle\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051836 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-config\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051880 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx2vm\" (UniqueName: \"kubernetes.io/projected/75776f95-41f8-4db4-8a2b-ab60510d54b4-kube-api-access-qx2vm\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051909 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-dns-svc\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051927 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-nb\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.051971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvqw\" (UniqueName: \"kubernetes.io/projected/978cd232-08ca-459a-91b9-a6c1a27ad58e-kube-api-access-2hvqw\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.052806 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-config\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.052863 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-dns-svc\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.052956 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-sb\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.053418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-nb\") pod 
\"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.060397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-httpd-config\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.065431 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-config\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.065490 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/978cd232-08ca-459a-91b9-a6c1a27ad58e-combined-ca-bundle\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.068510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hvqw\" (UniqueName: \"kubernetes.io/projected/978cd232-08ca-459a-91b9-a6c1a27ad58e-kube-api-access-2hvqw\") pod \"neutron-54f47ffb59-682nm\" (UID: \"978cd232-08ca-459a-91b9-a6c1a27ad58e\") " pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.073421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx2vm\" (UniqueName: \"kubernetes.io/projected/75776f95-41f8-4db4-8a2b-ab60510d54b4-kube-api-access-qx2vm\") pod \"dnsmasq-dns-7d9ff6bf97-fgklj\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " 
pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.115922 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.155914 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.663363 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d9ff6bf97-fgklj"] Dec 04 01:11:55 crc kubenswrapper[4764]: I1204 01:11:55.792167 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54f47ffb59-682nm"] Dec 04 01:11:55 crc kubenswrapper[4764]: W1204 01:11:55.810260 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod978cd232_08ca_459a_91b9_a6c1a27ad58e.slice/crio-c90c63ccbdc147aefda9635269afda6ca5b8eac254699961fdfc2a9e8c08a016 WatchSource:0}: Error finding container c90c63ccbdc147aefda9635269afda6ca5b8eac254699961fdfc2a9e8c08a016: Status 404 returned error can't find the container with id c90c63ccbdc147aefda9635269afda6ca5b8eac254699961fdfc2a9e8c08a016 Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.518468 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f47ffb59-682nm" event={"ID":"978cd232-08ca-459a-91b9-a6c1a27ad58e","Type":"ContainerStarted","Data":"79b30e72e098306b03541f9f9f774aadf2bf1bc2c50a929551fef8956b6bbded"} Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.518849 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f47ffb59-682nm" event={"ID":"978cd232-08ca-459a-91b9-a6c1a27ad58e","Type":"ContainerStarted","Data":"aea1f91ade2b36c26c5e9fc2eaf7627cac77f17523b2e2d8c33afc4c71d7bd08"} Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.518873 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.518888 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f47ffb59-682nm" event={"ID":"978cd232-08ca-459a-91b9-a6c1a27ad58e","Type":"ContainerStarted","Data":"c90c63ccbdc147aefda9635269afda6ca5b8eac254699961fdfc2a9e8c08a016"} Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.519785 4764 generic.go:334] "Generic (PLEG): container finished" podID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerID="5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb" exitCode=0 Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.519815 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" event={"ID":"75776f95-41f8-4db4-8a2b-ab60510d54b4","Type":"ContainerDied","Data":"5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb"} Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.519835 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" event={"ID":"75776f95-41f8-4db4-8a2b-ab60510d54b4","Type":"ContainerStarted","Data":"a298aa9e1ff5f5a6e830cff1fa8d63e8aebb5d01f3616ec09bc7e8a20ec82351"} Dec 04 01:11:56 crc kubenswrapper[4764]: I1204 01:11:56.553492 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54f47ffb59-682nm" podStartSLOduration=2.553469531 podStartE2EDuration="2.553469531s" podCreationTimestamp="2025-12-04 01:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:56.548757434 +0000 UTC m=+5452.310081875" watchObservedRunningTime="2025-12-04 01:11:56.553469531 +0000 UTC m=+5452.314793942" Dec 04 01:11:57 crc kubenswrapper[4764]: I1204 01:11:57.533007 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" 
event={"ID":"75776f95-41f8-4db4-8a2b-ab60510d54b4","Type":"ContainerStarted","Data":"723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a"} Dec 04 01:11:57 crc kubenswrapper[4764]: I1204 01:11:57.533371 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:11:57 crc kubenswrapper[4764]: I1204 01:11:57.556681 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" podStartSLOduration=3.5566600680000002 podStartE2EDuration="3.556660068s" podCreationTimestamp="2025-12-04 01:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:11:57.549873171 +0000 UTC m=+5453.311197592" watchObservedRunningTime="2025-12-04 01:11:57.556660068 +0000 UTC m=+5453.317984479" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.118175 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.189783 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f76578655-fgp47"] Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.190090 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f76578655-fgp47" podUID="c4eafd88-870a-463d-9138-01c26058cc32" containerName="dnsmasq-dns" containerID="cri-o://d07bcc7f5d09efdcce95c3ac207c5406706f295647eedbe6721c166617cee1e9" gracePeriod=10 Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.606710 4764 generic.go:334] "Generic (PLEG): container finished" podID="c4eafd88-870a-463d-9138-01c26058cc32" containerID="d07bcc7f5d09efdcce95c3ac207c5406706f295647eedbe6721c166617cee1e9" exitCode=0 Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.607015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5f76578655-fgp47" event={"ID":"c4eafd88-870a-463d-9138-01c26058cc32","Type":"ContainerDied","Data":"d07bcc7f5d09efdcce95c3ac207c5406706f295647eedbe6721c166617cee1e9"} Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.607047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f76578655-fgp47" event={"ID":"c4eafd88-870a-463d-9138-01c26058cc32","Type":"ContainerDied","Data":"db8b31fc7f1b4a1220aad12611b45c8ab8b4c257a6cc546c9250fa90b9c184f6"} Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.607075 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db8b31fc7f1b4a1220aad12611b45c8ab8b4c257a6cc546c9250fa90b9c184f6" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.669508 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.745365 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-nb\") pod \"c4eafd88-870a-463d-9138-01c26058cc32\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.745408 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d7gn\" (UniqueName: \"kubernetes.io/projected/c4eafd88-870a-463d-9138-01c26058cc32-kube-api-access-7d7gn\") pod \"c4eafd88-870a-463d-9138-01c26058cc32\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.745454 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-dns-svc\") pod \"c4eafd88-870a-463d-9138-01c26058cc32\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " Dec 04 01:12:05 crc 
kubenswrapper[4764]: I1204 01:12:05.745607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-config\") pod \"c4eafd88-870a-463d-9138-01c26058cc32\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.745625 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-sb\") pod \"c4eafd88-870a-463d-9138-01c26058cc32\" (UID: \"c4eafd88-870a-463d-9138-01c26058cc32\") " Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.769922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4eafd88-870a-463d-9138-01c26058cc32-kube-api-access-7d7gn" (OuterVolumeSpecName: "kube-api-access-7d7gn") pod "c4eafd88-870a-463d-9138-01c26058cc32" (UID: "c4eafd88-870a-463d-9138-01c26058cc32"). InnerVolumeSpecName "kube-api-access-7d7gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.800452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4eafd88-870a-463d-9138-01c26058cc32" (UID: "c4eafd88-870a-463d-9138-01c26058cc32"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.803075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4eafd88-870a-463d-9138-01c26058cc32" (UID: "c4eafd88-870a-463d-9138-01c26058cc32"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.807572 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-config" (OuterVolumeSpecName: "config") pod "c4eafd88-870a-463d-9138-01c26058cc32" (UID: "c4eafd88-870a-463d-9138-01c26058cc32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.810382 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4eafd88-870a-463d-9138-01c26058cc32" (UID: "c4eafd88-870a-463d-9138-01c26058cc32"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.847642 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.847671 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.847681 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:05 crc kubenswrapper[4764]: I1204 01:12:05.847690 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4eafd88-870a-463d-9138-01c26058cc32-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:05 crc kubenswrapper[4764]: 
I1204 01:12:05.847699 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d7gn\" (UniqueName: \"kubernetes.io/projected/c4eafd88-870a-463d-9138-01c26058cc32-kube-api-access-7d7gn\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:06 crc kubenswrapper[4764]: I1204 01:12:06.624255 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f76578655-fgp47" Dec 04 01:12:06 crc kubenswrapper[4764]: I1204 01:12:06.646287 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f76578655-fgp47"] Dec 04 01:12:06 crc kubenswrapper[4764]: I1204 01:12:06.653008 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f76578655-fgp47"] Dec 04 01:12:07 crc kubenswrapper[4764]: I1204 01:12:07.545466 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:12:07 crc kubenswrapper[4764]: E1204 01:12:07.546083 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:12:08 crc kubenswrapper[4764]: I1204 01:12:08.569897 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4eafd88-870a-463d-9138-01c26058cc32" path="/var/lib/kubelet/pods/c4eafd88-870a-463d-9138-01c26058cc32/volumes" Dec 04 01:12:20 crc kubenswrapper[4764]: I1204 01:12:20.545538 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:12:20 crc kubenswrapper[4764]: E1204 01:12:20.546423 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:12:25 crc kubenswrapper[4764]: I1204 01:12:25.170125 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54f47ffb59-682nm" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.050315 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rrg6h"] Dec 04 01:12:32 crc kubenswrapper[4764]: E1204 01:12:32.051301 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4eafd88-870a-463d-9138-01c26058cc32" containerName="init" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.051318 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4eafd88-870a-463d-9138-01c26058cc32" containerName="init" Dec 04 01:12:32 crc kubenswrapper[4764]: E1204 01:12:32.051334 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4eafd88-870a-463d-9138-01c26058cc32" containerName="dnsmasq-dns" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.051340 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4eafd88-870a-463d-9138-01c26058cc32" containerName="dnsmasq-dns" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.051524 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4eafd88-870a-463d-9138-01c26058cc32" containerName="dnsmasq-dns" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.052141 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.059984 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rrg6h"] Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.113725 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwvb6\" (UniqueName: \"kubernetes.io/projected/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-kube-api-access-wwvb6\") pod \"glance-db-create-rrg6h\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.114063 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-operator-scripts\") pod \"glance-db-create-rrg6h\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.144626 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b12d-account-create-update-dhqvn"] Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.146253 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.148510 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.154245 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b12d-account-create-update-dhqvn"] Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.216194 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-operator-scripts\") pod \"glance-db-create-rrg6h\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.216252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58j4\" (UniqueName: \"kubernetes.io/projected/2e11af5e-9683-4a35-9174-af685786f621-kube-api-access-x58j4\") pod \"glance-b12d-account-create-update-dhqvn\" (UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.216321 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwvb6\" (UniqueName: \"kubernetes.io/projected/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-kube-api-access-wwvb6\") pod \"glance-db-create-rrg6h\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.216417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e11af5e-9683-4a35-9174-af685786f621-operator-scripts\") pod \"glance-b12d-account-create-update-dhqvn\" (UID: 
\"2e11af5e-9683-4a35-9174-af685786f621\") " pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.217281 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-operator-scripts\") pod \"glance-db-create-rrg6h\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.236690 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwvb6\" (UniqueName: \"kubernetes.io/projected/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-kube-api-access-wwvb6\") pod \"glance-db-create-rrg6h\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.317247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e11af5e-9683-4a35-9174-af685786f621-operator-scripts\") pod \"glance-b12d-account-create-update-dhqvn\" (UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.317583 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58j4\" (UniqueName: \"kubernetes.io/projected/2e11af5e-9683-4a35-9174-af685786f621-kube-api-access-x58j4\") pod \"glance-b12d-account-create-update-dhqvn\" (UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.318341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e11af5e-9683-4a35-9174-af685786f621-operator-scripts\") pod \"glance-b12d-account-create-update-dhqvn\" 
(UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.334515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58j4\" (UniqueName: \"kubernetes.io/projected/2e11af5e-9683-4a35-9174-af685786f621-kube-api-access-x58j4\") pod \"glance-b12d-account-create-update-dhqvn\" (UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.380019 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.490317 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.873889 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rrg6h"] Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.884208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rrg6h" event={"ID":"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e","Type":"ContainerStarted","Data":"4ddef0dc6962eb55c0de99b47423d4805c9634396ccdf568003b31c1b64ab903"} Dec 04 01:12:32 crc kubenswrapper[4764]: I1204 01:12:32.942451 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b12d-account-create-update-dhqvn"] Dec 04 01:12:32 crc kubenswrapper[4764]: W1204 01:12:32.954131 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e11af5e_9683_4a35_9174_af685786f621.slice/crio-ca038430ef5e119e8e93e72d1928a9da1bd94921ac0e6c8f11ea31454ebec76b WatchSource:0}: Error finding container ca038430ef5e119e8e93e72d1928a9da1bd94921ac0e6c8f11ea31454ebec76b: Status 404 returned error can't find the 
container with id ca038430ef5e119e8e93e72d1928a9da1bd94921ac0e6c8f11ea31454ebec76b Dec 04 01:12:33 crc kubenswrapper[4764]: I1204 01:12:33.899087 4764 generic.go:334] "Generic (PLEG): container finished" podID="2e11af5e-9683-4a35-9174-af685786f621" containerID="d5c1d76698af66d19b806724063cd0182c5c74477e6a32d5519018720bf77beb" exitCode=0 Dec 04 01:12:33 crc kubenswrapper[4764]: I1204 01:12:33.899549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b12d-account-create-update-dhqvn" event={"ID":"2e11af5e-9683-4a35-9174-af685786f621","Type":"ContainerDied","Data":"d5c1d76698af66d19b806724063cd0182c5c74477e6a32d5519018720bf77beb"} Dec 04 01:12:33 crc kubenswrapper[4764]: I1204 01:12:33.899586 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b12d-account-create-update-dhqvn" event={"ID":"2e11af5e-9683-4a35-9174-af685786f621","Type":"ContainerStarted","Data":"ca038430ef5e119e8e93e72d1928a9da1bd94921ac0e6c8f11ea31454ebec76b"} Dec 04 01:12:33 crc kubenswrapper[4764]: I1204 01:12:33.902525 4764 generic.go:334] "Generic (PLEG): container finished" podID="b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" containerID="0b0fdf979fbceb888396dd108d754b0e720e42d387e7eef147f3e2a44b2b32eb" exitCode=0 Dec 04 01:12:33 crc kubenswrapper[4764]: I1204 01:12:33.902563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rrg6h" event={"ID":"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e","Type":"ContainerDied","Data":"0b0fdf979fbceb888396dd108d754b0e720e42d387e7eef147f3e2a44b2b32eb"} Dec 04 01:12:34 crc kubenswrapper[4764]: I1204 01:12:34.555620 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:12:34 crc kubenswrapper[4764]: E1204 01:12:34.556445 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.372282 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.377337 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwvb6\" (UniqueName: \"kubernetes.io/projected/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-kube-api-access-wwvb6\") pod \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.377520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-operator-scripts\") pod \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\" (UID: \"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e\") " Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.378338 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" (UID: "b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.379942 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.391708 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-kube-api-access-wwvb6" (OuterVolumeSpecName: "kube-api-access-wwvb6") pod "b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" (UID: "b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e"). InnerVolumeSpecName "kube-api-access-wwvb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.482318 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x58j4\" (UniqueName: \"kubernetes.io/projected/2e11af5e-9683-4a35-9174-af685786f621-kube-api-access-x58j4\") pod \"2e11af5e-9683-4a35-9174-af685786f621\" (UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.482538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e11af5e-9683-4a35-9174-af685786f621-operator-scripts\") pod \"2e11af5e-9683-4a35-9174-af685786f621\" (UID: \"2e11af5e-9683-4a35-9174-af685786f621\") " Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.483004 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwvb6\" (UniqueName: \"kubernetes.io/projected/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-kube-api-access-wwvb6\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.483027 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.491508 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2e11af5e-9683-4a35-9174-af685786f621-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e11af5e-9683-4a35-9174-af685786f621" (UID: "2e11af5e-9683-4a35-9174-af685786f621"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.491929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e11af5e-9683-4a35-9174-af685786f621-kube-api-access-x58j4" (OuterVolumeSpecName: "kube-api-access-x58j4") pod "2e11af5e-9683-4a35-9174-af685786f621" (UID: "2e11af5e-9683-4a35-9174-af685786f621"). InnerVolumeSpecName "kube-api-access-x58j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.585215 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e11af5e-9683-4a35-9174-af685786f621-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.585245 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58j4\" (UniqueName: \"kubernetes.io/projected/2e11af5e-9683-4a35-9174-af685786f621-kube-api-access-x58j4\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.920861 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b12d-account-create-update-dhqvn" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.920860 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b12d-account-create-update-dhqvn" event={"ID":"2e11af5e-9683-4a35-9174-af685786f621","Type":"ContainerDied","Data":"ca038430ef5e119e8e93e72d1928a9da1bd94921ac0e6c8f11ea31454ebec76b"} Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.921253 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca038430ef5e119e8e93e72d1928a9da1bd94921ac0e6c8f11ea31454ebec76b" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.922697 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rrg6h" event={"ID":"b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e","Type":"ContainerDied","Data":"4ddef0dc6962eb55c0de99b47423d4805c9634396ccdf568003b31c1b64ab903"} Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.922861 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ddef0dc6962eb55c0de99b47423d4805c9634396ccdf568003b31c1b64ab903" Dec 04 01:12:35 crc kubenswrapper[4764]: I1204 01:12:35.922805 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rrg6h" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.216679 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-44kxf"] Dec 04 01:12:37 crc kubenswrapper[4764]: E1204 01:12:37.217385 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" containerName="mariadb-database-create" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.217401 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" containerName="mariadb-database-create" Dec 04 01:12:37 crc kubenswrapper[4764]: E1204 01:12:37.217423 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e11af5e-9683-4a35-9174-af685786f621" containerName="mariadb-account-create-update" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.217430 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e11af5e-9683-4a35-9174-af685786f621" containerName="mariadb-account-create-update" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.217630 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" containerName="mariadb-database-create" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.217656 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e11af5e-9683-4a35-9174-af685786f621" containerName="mariadb-account-create-update" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.219584 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.223593 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.223806 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4bq64" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.233454 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-44kxf"] Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.314864 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-combined-ca-bundle\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.314921 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-config-data\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.315085 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwx2r\" (UniqueName: \"kubernetes.io/projected/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-kube-api-access-jwx2r\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.315180 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-db-sync-config-data\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.415811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwx2r\" (UniqueName: \"kubernetes.io/projected/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-kube-api-access-jwx2r\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.415881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-db-sync-config-data\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.415958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-combined-ca-bundle\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.415989 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-config-data\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.425312 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-db-sync-config-data\") pod \"glance-db-sync-44kxf\" (UID: 
\"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.435116 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-config-data\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.435574 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-combined-ca-bundle\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.456948 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwx2r\" (UniqueName: \"kubernetes.io/projected/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-kube-api-access-jwx2r\") pod \"glance-db-sync-44kxf\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:37 crc kubenswrapper[4764]: I1204 01:12:37.537996 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:38 crc kubenswrapper[4764]: I1204 01:12:38.162763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-44kxf"] Dec 04 01:12:38 crc kubenswrapper[4764]: I1204 01:12:38.956667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-44kxf" event={"ID":"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62","Type":"ContainerStarted","Data":"33dbced410566aca91de81299f67c654162c0cc7fb3bf152da675bf0311337b4"} Dec 04 01:12:38 crc kubenswrapper[4764]: I1204 01:12:38.957047 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-44kxf" event={"ID":"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62","Type":"ContainerStarted","Data":"7ede90e7a36fa2a69d6ec4eb66b95bc7cf7c971f24aac294d4cd695252833b25"} Dec 04 01:12:38 crc kubenswrapper[4764]: I1204 01:12:38.976684 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-44kxf" podStartSLOduration=1.9766639750000001 podStartE2EDuration="1.976663975s" podCreationTimestamp="2025-12-04 01:12:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:12:38.969563729 +0000 UTC m=+5494.730888140" watchObservedRunningTime="2025-12-04 01:12:38.976663975 +0000 UTC m=+5494.737988386" Dec 04 01:12:43 crc kubenswrapper[4764]: I1204 01:12:43.003631 4764 generic.go:334] "Generic (PLEG): container finished" podID="16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" containerID="33dbced410566aca91de81299f67c654162c0cc7fb3bf152da675bf0311337b4" exitCode=0 Dec 04 01:12:43 crc kubenswrapper[4764]: I1204 01:12:43.003764 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-44kxf" event={"ID":"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62","Type":"ContainerDied","Data":"33dbced410566aca91de81299f67c654162c0cc7fb3bf152da675bf0311337b4"} Dec 04 01:12:44 crc 
kubenswrapper[4764]: I1204 01:12:44.431358 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.544668 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-config-data\") pod \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.544746 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwx2r\" (UniqueName: \"kubernetes.io/projected/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-kube-api-access-jwx2r\") pod \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.544871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-db-sync-config-data\") pod \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.544909 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-combined-ca-bundle\") pod \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\" (UID: \"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62\") " Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.552313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" (UID: "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.555407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-kube-api-access-jwx2r" (OuterVolumeSpecName: "kube-api-access-jwx2r") pod "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" (UID: "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62"). InnerVolumeSpecName "kube-api-access-jwx2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.571407 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" (UID: "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.605931 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-config-data" (OuterVolumeSpecName: "config-data") pod "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" (UID: "16dea2f4-a927-4fcb-aca3-3ae06d5f7d62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.646462 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.646682 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwx2r\" (UniqueName: \"kubernetes.io/projected/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-kube-api-access-jwx2r\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.646782 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:44 crc kubenswrapper[4764]: I1204 01:12:44.646870 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.025796 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-44kxf" event={"ID":"16dea2f4-a927-4fcb-aca3-3ae06d5f7d62","Type":"ContainerDied","Data":"7ede90e7a36fa2a69d6ec4eb66b95bc7cf7c971f24aac294d4cd695252833b25"} Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.025851 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ede90e7a36fa2a69d6ec4eb66b95bc7cf7c971f24aac294d4cd695252833b25" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.025877 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-44kxf" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.440738 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c579fb595-gwnwm"] Dec 04 01:12:45 crc kubenswrapper[4764]: E1204 01:12:45.441322 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" containerName="glance-db-sync" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.441333 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" containerName="glance-db-sync" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.441518 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" containerName="glance-db-sync" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.442368 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.453236 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c579fb595-gwnwm"] Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.467931 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.473385 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.480018 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.480099 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.480132 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.480250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4bq64" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.504565 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.556954 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.558426 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.562060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq4b5\" (UniqueName: \"kubernetes.io/projected/0ba57644-78d0-46d0-aae7-98efa9ce7465-kube-api-access-hq4b5\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.562175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-config\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.562221 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-dns-svc\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.562254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-nb\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.562276 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-sb\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" 
(UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.562611 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.568518 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663674 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g6h2\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-kube-api-access-7g6h2\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663706 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663829 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-logs\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.663983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" 
(UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664003 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86kxr\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-kube-api-access-86kxr\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664029 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-config\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664534 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-scripts\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664580 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664621 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-logs\") pod \"glance-default-external-api-0\" (UID: 
\"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664744 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-dns-svc\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664846 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-nb\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664866 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-sb\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-config-data\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-config\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " 
pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.664939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq4b5\" (UniqueName: \"kubernetes.io/projected/0ba57644-78d0-46d0-aae7-98efa9ce7465-kube-api-access-hq4b5\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.665509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-dns-svc\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.665801 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-sb\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.666234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-nb\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.681366 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq4b5\" (UniqueName: \"kubernetes.io/projected/0ba57644-78d0-46d0-aae7-98efa9ce7465-kube-api-access-hq4b5\") pod \"dnsmasq-dns-5c579fb595-gwnwm\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc 
kubenswrapper[4764]: I1204 01:12:45.764812 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.765988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766015 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-scripts\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766030 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-logs\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766083 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-config-data\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766121 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g6h2\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-kube-api-access-7g6h2\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766167 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766198 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-ceph\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766216 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766232 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 
01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766259 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766276 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-logs\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766298 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766317 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86kxr\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-kube-api-access-86kxr\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.766941 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.768108 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-logs\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.768436 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-logs\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.768432 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.772422 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.772535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-ceph\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.773105 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.774594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-config-data\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.774601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-scripts\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.775899 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.783466 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.786159 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86kxr\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-kube-api-access-86kxr\") pod 
\"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.789271 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-ceph\") pod \"glance-default-external-api-0\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.789707 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g6h2\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-kube-api-access-7g6h2\") pod \"glance-default-internal-api-0\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.801178 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:12:45 crc kubenswrapper[4764]: I1204 01:12:45.888842 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:12:46 crc kubenswrapper[4764]: I1204 01:12:46.233860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c579fb595-gwnwm"] Dec 04 01:12:46 crc kubenswrapper[4764]: W1204 01:12:46.243410 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ba57644_78d0_46d0_aae7_98efa9ce7465.slice/crio-952efe510501ef897f99d23ce55148f6a9e5bcfb6c5936c1edff771057976212 WatchSource:0}: Error finding container 952efe510501ef897f99d23ce55148f6a9e5bcfb6c5936c1edff771057976212: Status 404 returned error can't find the container with id 952efe510501ef897f99d23ce55148f6a9e5bcfb6c5936c1edff771057976212 Dec 04 01:12:46 crc kubenswrapper[4764]: I1204 01:12:46.497974 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:46 crc kubenswrapper[4764]: I1204 01:12:46.532636 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:46 crc kubenswrapper[4764]: W1204 01:12:46.557809 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575a6c8c_8778_407f_8656_e772257d4a53.slice/crio-19e38c56bece52e3c10e28b46a0becc0137b73c348d325795cc5517ec2b3d1df WatchSource:0}: Error finding container 19e38c56bece52e3c10e28b46a0becc0137b73c348d325795cc5517ec2b3d1df: Status 404 returned error can't find the container with id 19e38c56bece52e3c10e28b46a0becc0137b73c348d325795cc5517ec2b3d1df Dec 04 01:12:47 crc kubenswrapper[4764]: I1204 01:12:47.049546 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerID="f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c" exitCode=0 Dec 04 01:12:47 crc kubenswrapper[4764]: I1204 01:12:47.049608 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" event={"ID":"0ba57644-78d0-46d0-aae7-98efa9ce7465","Type":"ContainerDied","Data":"f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c"} Dec 04 01:12:47 crc kubenswrapper[4764]: I1204 01:12:47.050259 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" event={"ID":"0ba57644-78d0-46d0-aae7-98efa9ce7465","Type":"ContainerStarted","Data":"952efe510501ef897f99d23ce55148f6a9e5bcfb6c5936c1edff771057976212"} Dec 04 01:12:47 crc kubenswrapper[4764]: I1204 01:12:47.052811 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"575a6c8c-8778-407f-8656-e772257d4a53","Type":"ContainerStarted","Data":"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea"} Dec 04 01:12:47 crc kubenswrapper[4764]: I1204 01:12:47.052848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"575a6c8c-8778-407f-8656-e772257d4a53","Type":"ContainerStarted","Data":"19e38c56bece52e3c10e28b46a0becc0137b73c348d325795cc5517ec2b3d1df"} Dec 04 01:12:47 crc kubenswrapper[4764]: I1204 01:12:47.399417 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:47 crc kubenswrapper[4764]: W1204 01:12:47.410038 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd34cfd21_a86f_43d6_bf89_1a1539351d75.slice/crio-62d037b7dc8fdfb71aa011a365119e8d25aa5447d9d5bc56af0ef9c26b55a0a7 WatchSource:0}: Error finding container 62d037b7dc8fdfb71aa011a365119e8d25aa5447d9d5bc56af0ef9c26b55a0a7: Status 404 returned error can't find the container with id 62d037b7dc8fdfb71aa011a365119e8d25aa5447d9d5bc56af0ef9c26b55a0a7 Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.072315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"575a6c8c-8778-407f-8656-e772257d4a53","Type":"ContainerStarted","Data":"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db"} Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.072364 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-log" containerID="cri-o://167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea" gracePeriod=30 Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.072617 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-httpd" containerID="cri-o://633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db" gracePeriod=30 Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.078262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d34cfd21-a86f-43d6-bf89-1a1539351d75","Type":"ContainerStarted","Data":"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2"} Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.078305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d34cfd21-a86f-43d6-bf89-1a1539351d75","Type":"ContainerStarted","Data":"62d037b7dc8fdfb71aa011a365119e8d25aa5447d9d5bc56af0ef9c26b55a0a7"} Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.082242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" event={"ID":"0ba57644-78d0-46d0-aae7-98efa9ce7465","Type":"ContainerStarted","Data":"925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723"} Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.082988 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.099975 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.099957585 podStartE2EDuration="3.099957585s" podCreationTimestamp="2025-12-04 01:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:12:48.094900461 +0000 UTC m=+5503.856224882" watchObservedRunningTime="2025-12-04 01:12:48.099957585 +0000 UTC m=+5503.861281996" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.119089 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" podStartSLOduration=3.119070776 podStartE2EDuration="3.119070776s" podCreationTimestamp="2025-12-04 01:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:12:48.109364287 +0000 UTC m=+5503.870688698" watchObservedRunningTime="2025-12-04 01:12:48.119070776 +0000 UTC m=+5503.880395187" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.305032 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.637251 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.727537 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-logs\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.727653 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-httpd-run\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.727699 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-ceph\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.727739 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-combined-ca-bundle\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.727821 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-scripts\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.727984 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86kxr\" (UniqueName: 
\"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-kube-api-access-86kxr\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.728073 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-config-data\") pod \"575a6c8c-8778-407f-8656-e772257d4a53\" (UID: \"575a6c8c-8778-407f-8656-e772257d4a53\") " Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.728114 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-logs" (OuterVolumeSpecName: "logs") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.728182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.728589 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.728615 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/575a6c8c-8778-407f-8656-e772257d4a53-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.734519 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-ceph" (OuterVolumeSpecName: "ceph") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.734883 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-scripts" (OuterVolumeSpecName: "scripts") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.736269 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-kube-api-access-86kxr" (OuterVolumeSpecName: "kube-api-access-86kxr") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "kube-api-access-86kxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.759975 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.785856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-config-data" (OuterVolumeSpecName: "config-data") pod "575a6c8c-8778-407f-8656-e772257d4a53" (UID: "575a6c8c-8778-407f-8656-e772257d4a53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.829951 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.829984 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.829994 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.830002 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86kxr\" (UniqueName: \"kubernetes.io/projected/575a6c8c-8778-407f-8656-e772257d4a53-kube-api-access-86kxr\") on node \"crc\" DevicePath \"\"" Dec 04 
01:12:48 crc kubenswrapper[4764]: I1204 01:12:48.830012 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575a6c8c-8778-407f-8656-e772257d4a53-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.091583 4764 generic.go:334] "Generic (PLEG): container finished" podID="575a6c8c-8778-407f-8656-e772257d4a53" containerID="633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db" exitCode=0 Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.091912 4764 generic.go:334] "Generic (PLEG): container finished" podID="575a6c8c-8778-407f-8656-e772257d4a53" containerID="167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea" exitCode=143 Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.091646 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.091670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"575a6c8c-8778-407f-8656-e772257d4a53","Type":"ContainerDied","Data":"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db"} Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.092019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"575a6c8c-8778-407f-8656-e772257d4a53","Type":"ContainerDied","Data":"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea"} Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.092046 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"575a6c8c-8778-407f-8656-e772257d4a53","Type":"ContainerDied","Data":"19e38c56bece52e3c10e28b46a0becc0137b73c348d325795cc5517ec2b3d1df"} Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.092068 4764 scope.go:117] "RemoveContainer" 
containerID="633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.095110 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-log" containerID="cri-o://fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2" gracePeriod=30 Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.095236 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-httpd" containerID="cri-o://1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333" gracePeriod=30 Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.095348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d34cfd21-a86f-43d6-bf89-1a1539351d75","Type":"ContainerStarted","Data":"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333"} Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.130291 4764 scope.go:117] "RemoveContainer" containerID="167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.130702 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.130678118 podStartE2EDuration="4.130678118s" podCreationTimestamp="2025-12-04 01:12:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:12:49.117862713 +0000 UTC m=+5504.879187144" watchObservedRunningTime="2025-12-04 01:12:49.130678118 +0000 UTC m=+5504.892002529" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.142285 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.147205 4764 scope.go:117] "RemoveContainer" containerID="633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db" Dec 04 01:12:49 crc kubenswrapper[4764]: E1204 01:12:49.147587 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db\": container with ID starting with 633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db not found: ID does not exist" containerID="633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.147624 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db"} err="failed to get container status \"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db\": rpc error: code = NotFound desc = could not find container \"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db\": container with ID starting with 633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db not found: ID does not exist" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.147649 4764 scope.go:117] "RemoveContainer" containerID="167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea" Dec 04 01:12:49 crc kubenswrapper[4764]: E1204 01:12:49.149119 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea\": container with ID starting with 167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea not found: ID does not exist" containerID="167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.149146 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea"} err="failed to get container status \"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea\": rpc error: code = NotFound desc = could not find container \"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea\": container with ID starting with 167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea not found: ID does not exist" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.149163 4764 scope.go:117] "RemoveContainer" containerID="633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.149329 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.149559 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db"} err="failed to get container status \"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db\": rpc error: code = NotFound desc = could not find container \"633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db\": container with ID starting with 633f4200b939327c1ff11f7c7d8af4365fb997cd8d4b1d0876ebd0143ee258db not found: ID does not exist" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.149755 4764 scope.go:117] "RemoveContainer" containerID="167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.150258 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea"} err="failed to get container status \"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea\": rpc error: code = 
NotFound desc = could not find container \"167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea\": container with ID starting with 167e82169d0e8d2cf30cca2fa1e0561c8768f69564fdbcfd9c9f1b540cb664ea not found: ID does not exist" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.168625 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:49 crc kubenswrapper[4764]: E1204 01:12:49.169178 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-httpd" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.169280 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-httpd" Dec 04 01:12:49 crc kubenswrapper[4764]: E1204 01:12:49.169357 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-log" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.169409 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-log" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.169624 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-httpd" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.169685 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="575a6c8c-8778-407f-8656-e772257d4a53" containerName="glance-log" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.170616 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.176494 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.185938 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.338670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.338748 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-ceph\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.338843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-logs\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.338911 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-config-data\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " 
pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.338942 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.339012 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-scripts\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.339056 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z98w\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-kube-api-access-7z98w\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.440418 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-config-data\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.440483 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " 
pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.440542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-scripts\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.440588 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z98w\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-kube-api-access-7z98w\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.441156 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.441350 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.441403 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-ceph\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc 
kubenswrapper[4764]: I1204 01:12:49.441428 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-logs\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.441819 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-logs\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.445566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-scripts\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.446579 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.446846 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-ceph\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.446933 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-config-data\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.459462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z98w\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-kube-api-access-7z98w\") pod \"glance-default-external-api-0\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.493841 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.545867 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:12:49 crc kubenswrapper[4764]: E1204 01:12:49.546040 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.659924 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.847673 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-logs\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.847795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-ceph\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.847994 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g6h2\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-kube-api-access-7g6h2\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.848040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-config-data\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.848103 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-scripts\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.848173 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-combined-ca-bundle\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.848228 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-httpd-run\") pod \"d34cfd21-a86f-43d6-bf89-1a1539351d75\" (UID: \"d34cfd21-a86f-43d6-bf89-1a1539351d75\") " Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.848867 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.849134 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-logs" (OuterVolumeSpecName: "logs") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.852711 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-ceph" (OuterVolumeSpecName: "ceph") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.852831 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-kube-api-access-7g6h2" (OuterVolumeSpecName: "kube-api-access-7g6h2") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "kube-api-access-7g6h2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.853442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-scripts" (OuterVolumeSpecName: "scripts") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.876790 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.910195 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-config-data" (OuterVolumeSpecName: "config-data") pod "d34cfd21-a86f-43d6-bf89-1a1539351d75" (UID: "d34cfd21-a86f-43d6-bf89-1a1539351d75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954529 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954560 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d34cfd21-a86f-43d6-bf89-1a1539351d75-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954569 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954578 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g6h2\" (UniqueName: \"kubernetes.io/projected/d34cfd21-a86f-43d6-bf89-1a1539351d75-kube-api-access-7g6h2\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954592 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954601 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:49 crc kubenswrapper[4764]: I1204 01:12:49.954611 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d34cfd21-a86f-43d6-bf89-1a1539351d75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.058390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Dec 04 01:12:50 crc kubenswrapper[4764]: W1204 01:12:50.059261 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42df602_041a_4cce_9576_fa19d5cb9750.slice/crio-80a395fec67ba56e529f505f520f0208ede1113a1bc48e4a842ff8967274ccec WatchSource:0}: Error finding container 80a395fec67ba56e529f505f520f0208ede1113a1bc48e4a842ff8967274ccec: Status 404 returned error can't find the container with id 80a395fec67ba56e529f505f520f0208ede1113a1bc48e4a842ff8967274ccec Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.112967 4764 generic.go:334] "Generic (PLEG): container finished" podID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerID="1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333" exitCode=0 Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.112996 4764 generic.go:334] "Generic (PLEG): container finished" podID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerID="fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2" exitCode=143 Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.113038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d34cfd21-a86f-43d6-bf89-1a1539351d75","Type":"ContainerDied","Data":"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333"} Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.113066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d34cfd21-a86f-43d6-bf89-1a1539351d75","Type":"ContainerDied","Data":"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2"} Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.113079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d34cfd21-a86f-43d6-bf89-1a1539351d75","Type":"ContainerDied","Data":"62d037b7dc8fdfb71aa011a365119e8d25aa5447d9d5bc56af0ef9c26b55a0a7"} Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.113096 4764 scope.go:117] "RemoveContainer" containerID="1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.113221 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.114954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e42df602-041a-4cce-9576-fa19d5cb9750","Type":"ContainerStarted","Data":"80a395fec67ba56e529f505f520f0208ede1113a1bc48e4a842ff8967274ccec"} Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.141193 4764 scope.go:117] "RemoveContainer" containerID="fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.147231 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.169774 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.178164 4764 scope.go:117] "RemoveContainer" containerID="1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.178259 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:50 crc kubenswrapper[4764]: E1204 01:12:50.178580 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-httpd" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.178593 4764 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-httpd" Dec 04 01:12:50 crc kubenswrapper[4764]: E1204 01:12:50.178612 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-log" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.178617 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-log" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.178820 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-httpd" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.178837 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" containerName="glance-log" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.179646 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: E1204 01:12:50.180366 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333\": container with ID starting with 1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333 not found: ID does not exist" containerID="1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.180389 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333"} err="failed to get container status \"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333\": rpc error: code = NotFound desc = could not find container \"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333\": container 
with ID starting with 1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333 not found: ID does not exist" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.180407 4764 scope.go:117] "RemoveContainer" containerID="fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2" Dec 04 01:12:50 crc kubenswrapper[4764]: E1204 01:12:50.180885 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2\": container with ID starting with fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2 not found: ID does not exist" containerID="fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.180899 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2"} err="failed to get container status \"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2\": rpc error: code = NotFound desc = could not find container \"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2\": container with ID starting with fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2 not found: ID does not exist" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.180911 4764 scope.go:117] "RemoveContainer" containerID="1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.181101 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333"} err="failed to get container status \"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333\": rpc error: code = NotFound desc = could not find container \"1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333\": 
container with ID starting with 1496b9e5a0b75547dca620082c3ffed30a0c541ba8c6483d3b22399c25de7333 not found: ID does not exist" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.181114 4764 scope.go:117] "RemoveContainer" containerID="fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.181244 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2"} err="failed to get container status \"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2\": rpc error: code = NotFound desc = could not find container \"fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2\": container with ID starting with fdb518479b2b730dcd26b8e5f823abc22bd354fabfe69ac0921ada138861cbd2 not found: ID does not exist" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.182183 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.184673 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-logs\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362502 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrf9q\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-kube-api-access-lrf9q\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362857 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362894 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.362934 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrf9q\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-kube-api-access-lrf9q\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465516 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465635 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465740 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.465948 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-logs\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.466267 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-logs\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.469478 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.470479 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.471089 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.471165 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.500619 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrf9q\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-kube-api-access-lrf9q\") pod \"glance-default-internal-api-0\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.530246 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.555213 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575a6c8c-8778-407f-8656-e772257d4a53" path="/var/lib/kubelet/pods/575a6c8c-8778-407f-8656-e772257d4a53/volumes" Dec 04 01:12:50 crc kubenswrapper[4764]: I1204 01:12:50.556096 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d34cfd21-a86f-43d6-bf89-1a1539351d75" path="/var/lib/kubelet/pods/d34cfd21-a86f-43d6-bf89-1a1539351d75/volumes" Dec 04 01:12:51 crc kubenswrapper[4764]: I1204 01:12:51.782019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e42df602-041a-4cce-9576-fa19d5cb9750","Type":"ContainerStarted","Data":"02caf2897aeae1c89816e55fc6b0cf6dc8791167a7e894adf8747732e4815bec"} Dec 04 01:12:51 crc kubenswrapper[4764]: I1204 01:12:51.868431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:12:51 crc kubenswrapper[4764]: W1204 01:12:51.877651 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5509e2e8_e0a9_48cd_9afa_17dfa3b38369.slice/crio-b3c2f52117edee2edf365741258497553bd92386653c19e29301c85db79d4bfc WatchSource:0}: Error finding container b3c2f52117edee2edf365741258497553bd92386653c19e29301c85db79d4bfc: Status 404 returned error can't find the container with id b3c2f52117edee2edf365741258497553bd92386653c19e29301c85db79d4bfc Dec 04 01:12:52 crc kubenswrapper[4764]: I1204 01:12:52.801321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e42df602-041a-4cce-9576-fa19d5cb9750","Type":"ContainerStarted","Data":"084829b6a456cb2942d5b33745e4a36ecc94629b56ddfd24cbd5605ca415e25f"} Dec 04 01:12:52 crc kubenswrapper[4764]: I1204 01:12:52.805944 4764 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5509e2e8-e0a9-48cd-9afa-17dfa3b38369","Type":"ContainerStarted","Data":"65b03840f2954627f6af8c2beb5d803307b845e7120689e81e966028c4620286"} Dec 04 01:12:52 crc kubenswrapper[4764]: I1204 01:12:52.806000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5509e2e8-e0a9-48cd-9afa-17dfa3b38369","Type":"ContainerStarted","Data":"b3c2f52117edee2edf365741258497553bd92386653c19e29301c85db79d4bfc"} Dec 04 01:12:52 crc kubenswrapper[4764]: I1204 01:12:52.836312 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.836287677 podStartE2EDuration="3.836287677s" podCreationTimestamp="2025-12-04 01:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:12:52.821550064 +0000 UTC m=+5508.582874485" watchObservedRunningTime="2025-12-04 01:12:52.836287677 +0000 UTC m=+5508.597612098" Dec 04 01:12:53 crc kubenswrapper[4764]: I1204 01:12:53.821877 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5509e2e8-e0a9-48cd-9afa-17dfa3b38369","Type":"ContainerStarted","Data":"aa91d592b342f511954169a3a8530652656813d36b12423829a4c4066288ce44"} Dec 04 01:12:53 crc kubenswrapper[4764]: I1204 01:12:53.855394 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.855372893 podStartE2EDuration="3.855372893s" podCreationTimestamp="2025-12-04 01:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:12:53.853952418 +0000 UTC m=+5509.615276869" watchObservedRunningTime="2025-12-04 01:12:53.855372893 +0000 UTC 
m=+5509.616697314" Dec 04 01:12:55 crc kubenswrapper[4764]: I1204 01:12:55.768776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:12:55 crc kubenswrapper[4764]: I1204 01:12:55.838893 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9ff6bf97-fgklj"] Dec 04 01:12:55 crc kubenswrapper[4764]: I1204 01:12:55.839567 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerName="dnsmasq-dns" containerID="cri-o://723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a" gracePeriod=10 Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.301635 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.452703 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-sb\") pod \"75776f95-41f8-4db4-8a2b-ab60510d54b4\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.452951 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-config\") pod \"75776f95-41f8-4db4-8a2b-ab60510d54b4\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.452979 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-nb\") pod \"75776f95-41f8-4db4-8a2b-ab60510d54b4\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " Dec 04 01:12:56 crc 
kubenswrapper[4764]: I1204 01:12:56.453049 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-dns-svc\") pod \"75776f95-41f8-4db4-8a2b-ab60510d54b4\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.453093 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx2vm\" (UniqueName: \"kubernetes.io/projected/75776f95-41f8-4db4-8a2b-ab60510d54b4-kube-api-access-qx2vm\") pod \"75776f95-41f8-4db4-8a2b-ab60510d54b4\" (UID: \"75776f95-41f8-4db4-8a2b-ab60510d54b4\") " Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.458578 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75776f95-41f8-4db4-8a2b-ab60510d54b4-kube-api-access-qx2vm" (OuterVolumeSpecName: "kube-api-access-qx2vm") pod "75776f95-41f8-4db4-8a2b-ab60510d54b4" (UID: "75776f95-41f8-4db4-8a2b-ab60510d54b4"). InnerVolumeSpecName "kube-api-access-qx2vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.500755 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75776f95-41f8-4db4-8a2b-ab60510d54b4" (UID: "75776f95-41f8-4db4-8a2b-ab60510d54b4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.501050 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "75776f95-41f8-4db4-8a2b-ab60510d54b4" (UID: "75776f95-41f8-4db4-8a2b-ab60510d54b4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.513304 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "75776f95-41f8-4db4-8a2b-ab60510d54b4" (UID: "75776f95-41f8-4db4-8a2b-ab60510d54b4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.529834 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-config" (OuterVolumeSpecName: "config") pod "75776f95-41f8-4db4-8a2b-ab60510d54b4" (UID: "75776f95-41f8-4db4-8a2b-ab60510d54b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.554255 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.554286 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.554296 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.554306 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx2vm\" (UniqueName: \"kubernetes.io/projected/75776f95-41f8-4db4-8a2b-ab60510d54b4-kube-api-access-qx2vm\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:56 crc 
kubenswrapper[4764]: I1204 01:12:56.554315 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/75776f95-41f8-4db4-8a2b-ab60510d54b4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.863644 4764 generic.go:334] "Generic (PLEG): container finished" podID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerID="723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a" exitCode=0 Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.863702 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" event={"ID":"75776f95-41f8-4db4-8a2b-ab60510d54b4","Type":"ContainerDied","Data":"723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a"} Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.863761 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" event={"ID":"75776f95-41f8-4db4-8a2b-ab60510d54b4","Type":"ContainerDied","Data":"a298aa9e1ff5f5a6e830cff1fa8d63e8aebb5d01f3616ec09bc7e8a20ec82351"} Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.863780 4764 scope.go:117] "RemoveContainer" containerID="723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.864928 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d9ff6bf97-fgklj" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.906576 4764 scope.go:117] "RemoveContainer" containerID="5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.907870 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d9ff6bf97-fgklj"] Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.923098 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d9ff6bf97-fgklj"] Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.937998 4764 scope.go:117] "RemoveContainer" containerID="723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a" Dec 04 01:12:56 crc kubenswrapper[4764]: E1204 01:12:56.938527 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a\": container with ID starting with 723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a not found: ID does not exist" containerID="723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.938611 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a"} err="failed to get container status \"723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a\": rpc error: code = NotFound desc = could not find container \"723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a\": container with ID starting with 723b64bc514fd2d3ebbe1fa13d5bcf09627908c2fa061222eb8ad5779b2b404a not found: ID does not exist" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.938685 4764 scope.go:117] "RemoveContainer" containerID="5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb" Dec 04 
01:12:56 crc kubenswrapper[4764]: E1204 01:12:56.939231 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb\": container with ID starting with 5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb not found: ID does not exist" containerID="5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb" Dec 04 01:12:56 crc kubenswrapper[4764]: I1204 01:12:56.939307 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb"} err="failed to get container status \"5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb\": rpc error: code = NotFound desc = could not find container \"5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb\": container with ID starting with 5c96d046fa8bda94e74a01c71354237eeea4000bf654455baac0e7b241cc73eb not found: ID does not exist" Dec 04 01:12:58 crc kubenswrapper[4764]: I1204 01:12:58.562007 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" path="/var/lib/kubelet/pods/75776f95-41f8-4db4-8a2b-ab60510d54b4/volumes" Dec 04 01:12:59 crc kubenswrapper[4764]: I1204 01:12:59.494561 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 01:12:59 crc kubenswrapper[4764]: I1204 01:12:59.496294 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 01:12:59 crc kubenswrapper[4764]: I1204 01:12:59.524152 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 01:12:59 crc kubenswrapper[4764]: I1204 01:12:59.531088 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Dec 04 01:12:59 crc kubenswrapper[4764]: I1204 01:12:59.898955 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 01:12:59 crc kubenswrapper[4764]: I1204 01:12:59.899002 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.530564 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.530896 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.545289 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:13:00 crc kubenswrapper[4764]: E1204 01:13:00.545557 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.558128 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.567684 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.907617 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 
01:13:00 crc kubenswrapper[4764]: I1204 01:13:00.907666 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:01 crc kubenswrapper[4764]: I1204 01:13:01.951307 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 01:13:01 crc kubenswrapper[4764]: I1204 01:13:01.951426 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 01:13:01 crc kubenswrapper[4764]: I1204 01:13:01.963415 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 01:13:02 crc kubenswrapper[4764]: I1204 01:13:02.887817 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:02 crc kubenswrapper[4764]: I1204 01:13:02.923797 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.510268 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m9pr5"] Dec 04 01:13:10 crc kubenswrapper[4764]: E1204 01:13:10.511060 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerName="init" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.511080 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerName="init" Dec 04 01:13:10 crc kubenswrapper[4764]: E1204 01:13:10.511106 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerName="dnsmasq-dns" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.511112 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerName="dnsmasq-dns" Dec 04 01:13:10 crc 
kubenswrapper[4764]: I1204 01:13:10.511273 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="75776f95-41f8-4db4-8a2b-ab60510d54b4" containerName="dnsmasq-dns" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.511886 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.518481 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m9pr5"] Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.611615 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzbk\" (UniqueName: \"kubernetes.io/projected/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-kube-api-access-ztzbk\") pod \"placement-db-create-m9pr5\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.611790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-operator-scripts\") pod \"placement-db-create-m9pr5\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.614881 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-eb72-account-create-update-6hb8g"] Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.616285 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.618105 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.624389 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb72-account-create-update-6hb8g"] Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.713255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzbk\" (UniqueName: \"kubernetes.io/projected/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-kube-api-access-ztzbk\") pod \"placement-db-create-m9pr5\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.713336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwn5f\" (UniqueName: \"kubernetes.io/projected/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-kube-api-access-cwn5f\") pod \"placement-eb72-account-create-update-6hb8g\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.713386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-operator-scripts\") pod \"placement-eb72-account-create-update-6hb8g\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.713422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-operator-scripts\") pod 
\"placement-db-create-m9pr5\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.714279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-operator-scripts\") pod \"placement-db-create-m9pr5\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.738875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzbk\" (UniqueName: \"kubernetes.io/projected/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-kube-api-access-ztzbk\") pod \"placement-db-create-m9pr5\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.814893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwn5f\" (UniqueName: \"kubernetes.io/projected/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-kube-api-access-cwn5f\") pod \"placement-eb72-account-create-update-6hb8g\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.814983 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-operator-scripts\") pod \"placement-eb72-account-create-update-6hb8g\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.815781 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-operator-scripts\") pod 
\"placement-eb72-account-create-update-6hb8g\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.831566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwn5f\" (UniqueName: \"kubernetes.io/projected/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-kube-api-access-cwn5f\") pod \"placement-eb72-account-create-update-6hb8g\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.837508 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:10 crc kubenswrapper[4764]: I1204 01:13:10.937918 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:11 crc kubenswrapper[4764]: I1204 01:13:11.354851 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m9pr5"] Dec 04 01:13:11 crc kubenswrapper[4764]: I1204 01:13:11.422092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb72-account-create-update-6hb8g"] Dec 04 01:13:11 crc kubenswrapper[4764]: W1204 01:13:11.428147 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbc5ed6_8b62_4371_a4e0_dd65fdcf0a43.slice/crio-bb0027c9ade7ab17bf3df03aa37ad57dcc6bdb21b80eeb0ad483e4ecd70be544 WatchSource:0}: Error finding container bb0027c9ade7ab17bf3df03aa37ad57dcc6bdb21b80eeb0ad483e4ecd70be544: Status 404 returned error can't find the container with id bb0027c9ade7ab17bf3df03aa37ad57dcc6bdb21b80eeb0ad483e4ecd70be544 Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.013643 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" containerID="c9abad1422dc9ad9e075587f1ca11853ab13a419b0379735f34274780f74e6ba" exitCode=0 Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.013783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb72-account-create-update-6hb8g" event={"ID":"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43","Type":"ContainerDied","Data":"c9abad1422dc9ad9e075587f1ca11853ab13a419b0379735f34274780f74e6ba"} Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.013819 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb72-account-create-update-6hb8g" event={"ID":"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43","Type":"ContainerStarted","Data":"bb0027c9ade7ab17bf3df03aa37ad57dcc6bdb21b80eeb0ad483e4ecd70be544"} Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.016267 4764 generic.go:334] "Generic (PLEG): container finished" podID="6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" containerID="f0d36fc378dec5fcca5b3624ff3e72903cfc690d0a49c4b4448161f64850cb8c" exitCode=0 Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.016315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m9pr5" event={"ID":"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f","Type":"ContainerDied","Data":"f0d36fc378dec5fcca5b3624ff3e72903cfc690d0a49c4b4448161f64850cb8c"} Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.016342 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m9pr5" event={"ID":"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f","Type":"ContainerStarted","Data":"7c86911728d4e5c78cde88be74b1d830d6318c16d5e7a7161ff9020d043fcbca"} Dec 04 01:13:12 crc kubenswrapper[4764]: I1204 01:13:12.546223 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:13:12 crc kubenswrapper[4764]: E1204 01:13:12.547005 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.552247 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.562065 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.570688 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzbk\" (UniqueName: \"kubernetes.io/projected/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-kube-api-access-ztzbk\") pod \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.570793 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-operator-scripts\") pod \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\" (UID: \"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f\") " Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.571111 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwn5f\" (UniqueName: \"kubernetes.io/projected/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-kube-api-access-cwn5f\") pod \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.571291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-operator-scripts\") pod \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\" (UID: \"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43\") " Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.571793 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" (UID: "6bf22e73-1e49-40e3-b33e-c0b5b9391f2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.571850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" (UID: "ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.578953 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-kube-api-access-cwn5f" (OuterVolumeSpecName: "kube-api-access-cwn5f") pod "ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" (UID: "ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43"). InnerVolumeSpecName "kube-api-access-cwn5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.583962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-kube-api-access-ztzbk" (OuterVolumeSpecName: "kube-api-access-ztzbk") pod "6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" (UID: "6bf22e73-1e49-40e3-b33e-c0b5b9391f2f"). InnerVolumeSpecName "kube-api-access-ztzbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.674578 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzbk\" (UniqueName: \"kubernetes.io/projected/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-kube-api-access-ztzbk\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.674609 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.674619 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwn5f\" (UniqueName: \"kubernetes.io/projected/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-kube-api-access-cwn5f\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:13 crc kubenswrapper[4764]: I1204 01:13:13.674628 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:14 crc kubenswrapper[4764]: I1204 01:13:14.040399 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m9pr5" event={"ID":"6bf22e73-1e49-40e3-b33e-c0b5b9391f2f","Type":"ContainerDied","Data":"7c86911728d4e5c78cde88be74b1d830d6318c16d5e7a7161ff9020d043fcbca"} Dec 04 01:13:14 crc kubenswrapper[4764]: I1204 01:13:14.040425 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-m9pr5" Dec 04 01:13:14 crc kubenswrapper[4764]: I1204 01:13:14.040445 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c86911728d4e5c78cde88be74b1d830d6318c16d5e7a7161ff9020d043fcbca" Dec 04 01:13:14 crc kubenswrapper[4764]: I1204 01:13:14.042845 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb72-account-create-update-6hb8g" event={"ID":"ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43","Type":"ContainerDied","Data":"bb0027c9ade7ab17bf3df03aa37ad57dcc6bdb21b80eeb0ad483e4ecd70be544"} Dec 04 01:13:14 crc kubenswrapper[4764]: I1204 01:13:14.042922 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb0027c9ade7ab17bf3df03aa37ad57dcc6bdb21b80eeb0ad483e4ecd70be544" Dec 04 01:13:14 crc kubenswrapper[4764]: I1204 01:13:14.042957 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb72-account-create-update-6hb8g" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.021944 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c95ff496c-ntszq"] Dec 04 01:13:16 crc kubenswrapper[4764]: E1204 01:13:16.022568 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" containerName="mariadb-account-create-update" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.022580 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" containerName="mariadb-account-create-update" Dec 04 01:13:16 crc kubenswrapper[4764]: E1204 01:13:16.022593 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" containerName="mariadb-database-create" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.022599 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" 
containerName="mariadb-database-create" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.022774 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" containerName="mariadb-database-create" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.022785 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" containerName="mariadb-account-create-update" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.023657 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.029863 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c95ff496c-ntszq"] Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.042343 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-config\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.042473 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.042514 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " 
pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.042690 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6ll\" (UniqueName: \"kubernetes.io/projected/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-kube-api-access-zz6ll\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.042751 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-dns-svc\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.077953 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-pzdx2"] Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.079276 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.081190 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.081604 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8jc8x" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.081820 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.086929 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pzdx2"] Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-combined-ca-bundle\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143627 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-config-data\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143705 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6ll\" (UniqueName: \"kubernetes.io/projected/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-kube-api-access-zz6ll\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-dns-svc\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-config\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143877 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-scripts\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463f5236-7838-4635-a52b-ca2a2ef4f477-logs\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143930 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cxppc\" (UniqueName: \"kubernetes.io/projected/463f5236-7838-4635-a52b-ca2a2ef4f477-kube-api-access-cxppc\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.143956 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.144654 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-nb\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.145370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-sb\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.145374 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-dns-svc\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.145847 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-config\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.161752 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6ll\" (UniqueName: \"kubernetes.io/projected/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-kube-api-access-zz6ll\") pod \"dnsmasq-dns-5c95ff496c-ntszq\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.244946 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-scripts\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.244990 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463f5236-7838-4635-a52b-ca2a2ef4f477-logs\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.245021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxppc\" (UniqueName: \"kubernetes.io/projected/463f5236-7838-4635-a52b-ca2a2ef4f477-kube-api-access-cxppc\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.245064 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-combined-ca-bundle\") pod 
\"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.245110 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-config-data\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.245660 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463f5236-7838-4635-a52b-ca2a2ef4f477-logs\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.248622 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-config-data\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.251602 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-combined-ca-bundle\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.254369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-scripts\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.263295 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxppc\" (UniqueName: \"kubernetes.io/projected/463f5236-7838-4635-a52b-ca2a2ef4f477-kube-api-access-cxppc\") pod \"placement-db-sync-pzdx2\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.341621 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.398618 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.777080 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c95ff496c-ntszq"] Dec 04 01:13:16 crc kubenswrapper[4764]: I1204 01:13:16.917020 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-pzdx2"] Dec 04 01:13:17 crc kubenswrapper[4764]: I1204 01:13:17.071043 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzdx2" event={"ID":"463f5236-7838-4635-a52b-ca2a2ef4f477","Type":"ContainerStarted","Data":"da82f0d0b0ce785128294d8301fab814ead20f0f362ef6d89614db667fcff47a"} Dec 04 01:13:17 crc kubenswrapper[4764]: I1204 01:13:17.074469 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerID="d18228b4af3f21059dc9de2c989e2945ec2eed3c6c01a8e1241f107e3a22092f" exitCode=0 Dec 04 01:13:17 crc kubenswrapper[4764]: I1204 01:13:17.074508 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" event={"ID":"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4","Type":"ContainerDied","Data":"d18228b4af3f21059dc9de2c989e2945ec2eed3c6c01a8e1241f107e3a22092f"} Dec 04 01:13:17 crc kubenswrapper[4764]: I1204 01:13:17.074528 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" event={"ID":"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4","Type":"ContainerStarted","Data":"e6c2d6ef6853ea0d3c809a9231f09338967100e6faa18a7d37b204b37124ed56"} Dec 04 01:13:18 crc kubenswrapper[4764]: I1204 01:13:18.085993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzdx2" event={"ID":"463f5236-7838-4635-a52b-ca2a2ef4f477","Type":"ContainerStarted","Data":"b261d11aa6316127130d3b54b69517d2eb7813fa16e9ce8b367b96a04c3462ed"} Dec 04 01:13:18 crc kubenswrapper[4764]: I1204 01:13:18.089529 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" event={"ID":"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4","Type":"ContainerStarted","Data":"86a404821c1b0b4c38d049838203bad535961ecbb664ad5078e75aeb0e192169"} Dec 04 01:13:18 crc kubenswrapper[4764]: I1204 01:13:18.089925 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:18 crc kubenswrapper[4764]: I1204 01:13:18.107188 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-pzdx2" podStartSLOduration=2.107170104 podStartE2EDuration="2.107170104s" podCreationTimestamp="2025-12-04 01:13:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:13:18.102819387 +0000 UTC m=+5533.864143798" watchObservedRunningTime="2025-12-04 01:13:18.107170104 +0000 UTC m=+5533.868494515" Dec 04 01:13:18 crc kubenswrapper[4764]: I1204 01:13:18.130139 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" podStartSLOduration=3.130120769 podStartE2EDuration="3.130120769s" podCreationTimestamp="2025-12-04 01:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 01:13:18.120530013 +0000 UTC m=+5533.881854454" watchObservedRunningTime="2025-12-04 01:13:18.130120769 +0000 UTC m=+5533.891445180" Dec 04 01:13:19 crc kubenswrapper[4764]: I1204 01:13:19.106695 4764 generic.go:334] "Generic (PLEG): container finished" podID="463f5236-7838-4635-a52b-ca2a2ef4f477" containerID="b261d11aa6316127130d3b54b69517d2eb7813fa16e9ce8b367b96a04c3462ed" exitCode=0 Dec 04 01:13:19 crc kubenswrapper[4764]: I1204 01:13:19.106826 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzdx2" event={"ID":"463f5236-7838-4635-a52b-ca2a2ef4f477","Type":"ContainerDied","Data":"b261d11aa6316127130d3b54b69517d2eb7813fa16e9ce8b367b96a04c3462ed"} Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.553246 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.726385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxppc\" (UniqueName: \"kubernetes.io/projected/463f5236-7838-4635-a52b-ca2a2ef4f477-kube-api-access-cxppc\") pod \"463f5236-7838-4635-a52b-ca2a2ef4f477\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.726445 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-config-data\") pod \"463f5236-7838-4635-a52b-ca2a2ef4f477\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.726498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463f5236-7838-4635-a52b-ca2a2ef4f477-logs\") pod \"463f5236-7838-4635-a52b-ca2a2ef4f477\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " Dec 04 01:13:20 crc 
kubenswrapper[4764]: I1204 01:13:20.726655 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-combined-ca-bundle\") pod \"463f5236-7838-4635-a52b-ca2a2ef4f477\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.726737 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-scripts\") pod \"463f5236-7838-4635-a52b-ca2a2ef4f477\" (UID: \"463f5236-7838-4635-a52b-ca2a2ef4f477\") " Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.727627 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/463f5236-7838-4635-a52b-ca2a2ef4f477-logs" (OuterVolumeSpecName: "logs") pod "463f5236-7838-4635-a52b-ca2a2ef4f477" (UID: "463f5236-7838-4635-a52b-ca2a2ef4f477"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.734119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-scripts" (OuterVolumeSpecName: "scripts") pod "463f5236-7838-4635-a52b-ca2a2ef4f477" (UID: "463f5236-7838-4635-a52b-ca2a2ef4f477"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.736048 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/463f5236-7838-4635-a52b-ca2a2ef4f477-kube-api-access-cxppc" (OuterVolumeSpecName: "kube-api-access-cxppc") pod "463f5236-7838-4635-a52b-ca2a2ef4f477" (UID: "463f5236-7838-4635-a52b-ca2a2ef4f477"). InnerVolumeSpecName "kube-api-access-cxppc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.756312 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "463f5236-7838-4635-a52b-ca2a2ef4f477" (UID: "463f5236-7838-4635-a52b-ca2a2ef4f477"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.768787 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-config-data" (OuterVolumeSpecName: "config-data") pod "463f5236-7838-4635-a52b-ca2a2ef4f477" (UID: "463f5236-7838-4635-a52b-ca2a2ef4f477"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.829702 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.829779 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxppc\" (UniqueName: \"kubernetes.io/projected/463f5236-7838-4635-a52b-ca2a2ef4f477-kube-api-access-cxppc\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.829798 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:20 crc kubenswrapper[4764]: I1204 01:13:20.829816 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/463f5236-7838-4635-a52b-ca2a2ef4f477-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:20 crc 
kubenswrapper[4764]: I1204 01:13:20.829833 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/463f5236-7838-4635-a52b-ca2a2ef4f477-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.155795 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-pzdx2" event={"ID":"463f5236-7838-4635-a52b-ca2a2ef4f477","Type":"ContainerDied","Data":"da82f0d0b0ce785128294d8301fab814ead20f0f362ef6d89614db667fcff47a"} Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.155837 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da82f0d0b0ce785128294d8301fab814ead20f0f362ef6d89614db667fcff47a" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.155866 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-pzdx2" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.229459 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54c9bb54b6-85qmh"] Dec 04 01:13:21 crc kubenswrapper[4764]: E1204 01:13:21.229866 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="463f5236-7838-4635-a52b-ca2a2ef4f477" containerName="placement-db-sync" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.229886 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="463f5236-7838-4635-a52b-ca2a2ef4f477" containerName="placement-db-sync" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.230113 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="463f5236-7838-4635-a52b-ca2a2ef4f477" containerName="placement-db-sync" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.231223 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.233711 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-8jc8x" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.233783 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.235585 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.244802 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54c9bb54b6-85qmh"] Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.339147 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-scripts\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.339199 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-logs\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.339266 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5h6f\" (UniqueName: \"kubernetes.io/projected/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-kube-api-access-s5h6f\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.339324 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-config-data\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.339372 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-combined-ca-bundle\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.440788 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5h6f\" (UniqueName: \"kubernetes.io/projected/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-kube-api-access-s5h6f\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.440851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-config-data\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.440893 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-combined-ca-bundle\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.440959 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-scripts\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.440977 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-logs\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.441335 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-logs\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.444951 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-scripts\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.445311 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-combined-ca-bundle\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.448043 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-config-data\") pod 
\"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.460964 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5h6f\" (UniqueName: \"kubernetes.io/projected/7847c079-88f3-4ae0-a4b9-666c82f8b8a6-kube-api-access-s5h6f\") pod \"placement-54c9bb54b6-85qmh\" (UID: \"7847c079-88f3-4ae0-a4b9-666c82f8b8a6\") " pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:21 crc kubenswrapper[4764]: I1204 01:13:21.562597 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:22 crc kubenswrapper[4764]: I1204 01:13:22.114596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54c9bb54b6-85qmh"] Dec 04 01:13:22 crc kubenswrapper[4764]: W1204 01:13:22.134492 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7847c079_88f3_4ae0_a4b9_666c82f8b8a6.slice/crio-22815f0540a6e56c21ae2ae30b8b41a266527230f6100776a18a663bc13ca441 WatchSource:0}: Error finding container 22815f0540a6e56c21ae2ae30b8b41a266527230f6100776a18a663bc13ca441: Status 404 returned error can't find the container with id 22815f0540a6e56c21ae2ae30b8b41a266527230f6100776a18a663bc13ca441 Dec 04 01:13:22 crc kubenswrapper[4764]: I1204 01:13:22.165170 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c9bb54b6-85qmh" event={"ID":"7847c079-88f3-4ae0-a4b9-666c82f8b8a6","Type":"ContainerStarted","Data":"22815f0540a6e56c21ae2ae30b8b41a266527230f6100776a18a663bc13ca441"} Dec 04 01:13:23 crc kubenswrapper[4764]: I1204 01:13:23.177583 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c9bb54b6-85qmh" 
event={"ID":"7847c079-88f3-4ae0-a4b9-666c82f8b8a6","Type":"ContainerStarted","Data":"f0a812b373c24020ea1200761be91bbfc73d8cc4b48945239412866f695d22b2"} Dec 04 01:13:23 crc kubenswrapper[4764]: I1204 01:13:23.177972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c9bb54b6-85qmh" event={"ID":"7847c079-88f3-4ae0-a4b9-666c82f8b8a6","Type":"ContainerStarted","Data":"83de939d51e37b68da9b92693a2b6a708c666fb0a802fd5f35a652414b03d171"} Dec 04 01:13:23 crc kubenswrapper[4764]: I1204 01:13:23.178002 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:23 crc kubenswrapper[4764]: I1204 01:13:23.178022 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:23 crc kubenswrapper[4764]: I1204 01:13:23.225166 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54c9bb54b6-85qmh" podStartSLOduration=2.22513983 podStartE2EDuration="2.22513983s" podCreationTimestamp="2025-12-04 01:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:13:23.208563392 +0000 UTC m=+5538.969887843" watchObservedRunningTime="2025-12-04 01:13:23.22513983 +0000 UTC m=+5538.986464281" Dec 04 01:13:23 crc kubenswrapper[4764]: I1204 01:13:23.545639 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:13:23 crc kubenswrapper[4764]: E1204 01:13:23.546530 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.342932 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.409413 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c579fb595-gwnwm"] Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.409731 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerName="dnsmasq-dns" containerID="cri-o://925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723" gracePeriod=10 Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.862140 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.961209 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq4b5\" (UniqueName: \"kubernetes.io/projected/0ba57644-78d0-46d0-aae7-98efa9ce7465-kube-api-access-hq4b5\") pod \"0ba57644-78d0-46d0-aae7-98efa9ce7465\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.961348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-sb\") pod \"0ba57644-78d0-46d0-aae7-98efa9ce7465\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.961384 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-dns-svc\") pod 
\"0ba57644-78d0-46d0-aae7-98efa9ce7465\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.961463 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-config\") pod \"0ba57644-78d0-46d0-aae7-98efa9ce7465\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.961486 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-nb\") pod \"0ba57644-78d0-46d0-aae7-98efa9ce7465\" (UID: \"0ba57644-78d0-46d0-aae7-98efa9ce7465\") " Dec 04 01:13:26 crc kubenswrapper[4764]: I1204 01:13:26.967054 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ba57644-78d0-46d0-aae7-98efa9ce7465-kube-api-access-hq4b5" (OuterVolumeSpecName: "kube-api-access-hq4b5") pod "0ba57644-78d0-46d0-aae7-98efa9ce7465" (UID: "0ba57644-78d0-46d0-aae7-98efa9ce7465"). InnerVolumeSpecName "kube-api-access-hq4b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.015097 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ba57644-78d0-46d0-aae7-98efa9ce7465" (UID: "0ba57644-78d0-46d0-aae7-98efa9ce7465"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.015831 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ba57644-78d0-46d0-aae7-98efa9ce7465" (UID: "0ba57644-78d0-46d0-aae7-98efa9ce7465"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.021460 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ba57644-78d0-46d0-aae7-98efa9ce7465" (UID: "0ba57644-78d0-46d0-aae7-98efa9ce7465"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.029676 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-config" (OuterVolumeSpecName: "config") pod "0ba57644-78d0-46d0-aae7-98efa9ce7465" (UID: "0ba57644-78d0-46d0-aae7-98efa9ce7465"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.063609 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq4b5\" (UniqueName: \"kubernetes.io/projected/0ba57644-78d0-46d0-aae7-98efa9ce7465-kube-api-access-hq4b5\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.063661 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.063691 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.063709 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.063748 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ba57644-78d0-46d0-aae7-98efa9ce7465-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.216734 4764 generic.go:334] "Generic (PLEG): container finished" podID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerID="925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723" exitCode=0 Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.216788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" event={"ID":"0ba57644-78d0-46d0-aae7-98efa9ce7465","Type":"ContainerDied","Data":"925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723"} Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 
01:13:27.216802 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.216828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c579fb595-gwnwm" event={"ID":"0ba57644-78d0-46d0-aae7-98efa9ce7465","Type":"ContainerDied","Data":"952efe510501ef897f99d23ce55148f6a9e5bcfb6c5936c1edff771057976212"} Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.216851 4764 scope.go:117] "RemoveContainer" containerID="925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.237579 4764 scope.go:117] "RemoveContainer" containerID="f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.253558 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c579fb595-gwnwm"] Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.260389 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c579fb595-gwnwm"] Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.275239 4764 scope.go:117] "RemoveContainer" containerID="925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723" Dec 04 01:13:27 crc kubenswrapper[4764]: E1204 01:13:27.275623 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723\": container with ID starting with 925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723 not found: ID does not exist" containerID="925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.275652 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723"} err="failed to get container status \"925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723\": rpc error: code = NotFound desc = could not find container \"925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723\": container with ID starting with 925addc8e1dd7f4c4725f8ed1c3b7d4398ecc04149d61300dc9690683760d723 not found: ID does not exist" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.275673 4764 scope.go:117] "RemoveContainer" containerID="f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c" Dec 04 01:13:27 crc kubenswrapper[4764]: E1204 01:13:27.276073 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c\": container with ID starting with f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c not found: ID does not exist" containerID="f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c" Dec 04 01:13:27 crc kubenswrapper[4764]: I1204 01:13:27.276102 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c"} err="failed to get container status \"f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c\": rpc error: code = NotFound desc = could not find container \"f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c\": container with ID starting with f02159b94c0d48104e669cdf97947d12df4593469457931dc186d0edd6b9045c not found: ID does not exist" Dec 04 01:13:28 crc kubenswrapper[4764]: I1204 01:13:28.565984 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" path="/var/lib/kubelet/pods/0ba57644-78d0-46d0-aae7-98efa9ce7465/volumes" Dec 04 01:13:36 crc kubenswrapper[4764]: I1204 
01:13:36.546462 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:13:36 crc kubenswrapper[4764]: E1204 01:13:36.547754 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:13:49 crc kubenswrapper[4764]: I1204 01:13:49.547451 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:13:49 crc kubenswrapper[4764]: E1204 01:13:49.548839 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:13:52 crc kubenswrapper[4764]: I1204 01:13:52.538098 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:13:53 crc kubenswrapper[4764]: I1204 01:13:53.603860 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54c9bb54b6-85qmh" Dec 04 01:14:00 crc kubenswrapper[4764]: I1204 01:14:00.547524 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:14:00 crc kubenswrapper[4764]: E1204 01:14:00.548780 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:14:11 crc kubenswrapper[4764]: I1204 01:14:11.147411 4764 scope.go:117] "RemoveContainer" containerID="9bb5a845ff8875f101d463d2ec1eb9d7451f22c737c763b643421531dfba0ddd" Dec 04 01:14:11 crc kubenswrapper[4764]: I1204 01:14:11.181679 4764 scope.go:117] "RemoveContainer" containerID="b03d1bc808bf3cfeb33c2dcb642e3fb25e640c57fef08dec687040dd1b7dee16" Dec 04 01:14:15 crc kubenswrapper[4764]: I1204 01:14:15.546174 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:14:15 crc kubenswrapper[4764]: E1204 01:14:15.547345 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.503509 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-drt66"] Dec 04 01:14:17 crc kubenswrapper[4764]: E1204 01:14:17.504008 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerName="init" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.504026 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerName="init" Dec 04 01:14:17 crc kubenswrapper[4764]: E1204 01:14:17.504074 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerName="dnsmasq-dns" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.504081 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerName="dnsmasq-dns" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.504278 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ba57644-78d0-46d0-aae7-98efa9ce7465" containerName="dnsmasq-dns" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.504885 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.514294 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-drt66"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.594870 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qdtmh"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.595975 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.611803 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qdtmh"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.639231 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7003749-8986-4904-acbf-32390dab7600-operator-scripts\") pod \"nova-api-db-create-drt66\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.639298 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bvdf\" (UniqueName: \"kubernetes.io/projected/a7003749-8986-4904-acbf-32390dab7600-kube-api-access-6bvdf\") pod \"nova-api-db-create-drt66\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.701124 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cccc-account-create-update-2rbt2"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.702389 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.704015 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.713531 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cccc-account-create-update-2rbt2"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.740469 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07e40f27-93e1-44ed-8887-e7ba539f4270-operator-scripts\") pod \"nova-cell0-db-create-qdtmh\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.740543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bvdf\" (UniqueName: \"kubernetes.io/projected/a7003749-8986-4904-acbf-32390dab7600-kube-api-access-6bvdf\") pod \"nova-api-db-create-drt66\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.740634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfnrf\" (UniqueName: \"kubernetes.io/projected/07e40f27-93e1-44ed-8887-e7ba539f4270-kube-api-access-gfnrf\") pod \"nova-cell0-db-create-qdtmh\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.740676 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7003749-8986-4904-acbf-32390dab7600-operator-scripts\") pod \"nova-api-db-create-drt66\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " 
pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.741304 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7003749-8986-4904-acbf-32390dab7600-operator-scripts\") pod \"nova-api-db-create-drt66\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.759437 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bvdf\" (UniqueName: \"kubernetes.io/projected/a7003749-8986-4904-acbf-32390dab7600-kube-api-access-6bvdf\") pod \"nova-api-db-create-drt66\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.798088 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-xpp64"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.808938 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.810953 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xpp64"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.819611 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.842672 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfnrf\" (UniqueName: \"kubernetes.io/projected/07e40f27-93e1-44ed-8887-e7ba539f4270-kube-api-access-gfnrf\") pod \"nova-cell0-db-create-qdtmh\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.842789 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07e40f27-93e1-44ed-8887-e7ba539f4270-operator-scripts\") pod \"nova-cell0-db-create-qdtmh\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.842826 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33377c8a-9ea1-440b-aac0-8492e36ae5d3-operator-scripts\") pod \"nova-api-cccc-account-create-update-2rbt2\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.842873 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4999\" (UniqueName: \"kubernetes.io/projected/33377c8a-9ea1-440b-aac0-8492e36ae5d3-kube-api-access-x4999\") pod \"nova-api-cccc-account-create-update-2rbt2\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.843609 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07e40f27-93e1-44ed-8887-e7ba539f4270-operator-scripts\") pod 
\"nova-cell0-db-create-qdtmh\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.862421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfnrf\" (UniqueName: \"kubernetes.io/projected/07e40f27-93e1-44ed-8887-e7ba539f4270-kube-api-access-gfnrf\") pod \"nova-cell0-db-create-qdtmh\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.909074 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.942780 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7cb3-account-create-update-k4bwd"] Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.943896 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.945448 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386aa79c-4a08-4154-896d-f7be1f444951-operator-scripts\") pod \"nova-cell1-db-create-xpp64\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.945501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qbbt\" (UniqueName: \"kubernetes.io/projected/386aa79c-4a08-4154-896d-f7be1f444951-kube-api-access-6qbbt\") pod \"nova-cell1-db-create-xpp64\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.945574 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33377c8a-9ea1-440b-aac0-8492e36ae5d3-operator-scripts\") pod \"nova-api-cccc-account-create-update-2rbt2\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.945632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4999\" (UniqueName: \"kubernetes.io/projected/33377c8a-9ea1-440b-aac0-8492e36ae5d3-kube-api-access-x4999\") pod \"nova-api-cccc-account-create-update-2rbt2\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.946459 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33377c8a-9ea1-440b-aac0-8492e36ae5d3-operator-scripts\") pod \"nova-api-cccc-account-create-update-2rbt2\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.946634 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.973272 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4999\" (UniqueName: \"kubernetes.io/projected/33377c8a-9ea1-440b-aac0-8492e36ae5d3-kube-api-access-x4999\") pod \"nova-api-cccc-account-create-update-2rbt2\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:17 crc kubenswrapper[4764]: I1204 01:14:17.991300 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7cb3-account-create-update-k4bwd"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.019081 4764 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.048347 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015a0994-0aee-4ed7-a588-ed74a8cb5db3-operator-scripts\") pod \"nova-cell0-7cb3-account-create-update-k4bwd\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.048422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386aa79c-4a08-4154-896d-f7be1f444951-operator-scripts\") pod \"nova-cell1-db-create-xpp64\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.048470 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf7rj\" (UniqueName: \"kubernetes.io/projected/015a0994-0aee-4ed7-a588-ed74a8cb5db3-kube-api-access-nf7rj\") pod \"nova-cell0-7cb3-account-create-update-k4bwd\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.048493 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qbbt\" (UniqueName: \"kubernetes.io/projected/386aa79c-4a08-4154-896d-f7be1f444951-kube-api-access-6qbbt\") pod \"nova-cell1-db-create-xpp64\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.049308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/386aa79c-4a08-4154-896d-f7be1f444951-operator-scripts\") pod \"nova-cell1-db-create-xpp64\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.090126 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qbbt\" (UniqueName: \"kubernetes.io/projected/386aa79c-4a08-4154-896d-f7be1f444951-kube-api-access-6qbbt\") pod \"nova-cell1-db-create-xpp64\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.137029 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.141812 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-937f-account-create-update-k224b"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.145229 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.147597 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.149607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015a0994-0aee-4ed7-a588-ed74a8cb5db3-operator-scripts\") pod \"nova-cell0-7cb3-account-create-update-k4bwd\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.149705 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf7rj\" (UniqueName: \"kubernetes.io/projected/015a0994-0aee-4ed7-a588-ed74a8cb5db3-kube-api-access-nf7rj\") pod \"nova-cell0-7cb3-account-create-update-k4bwd\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.150323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015a0994-0aee-4ed7-a588-ed74a8cb5db3-operator-scripts\") pod \"nova-cell0-7cb3-account-create-update-k4bwd\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.218691 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-937f-account-create-update-k224b"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.251796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf7rj\" (UniqueName: \"kubernetes.io/projected/015a0994-0aee-4ed7-a588-ed74a8cb5db3-kube-api-access-nf7rj\") pod 
\"nova-cell0-7cb3-account-create-update-k4bwd\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.251816 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-operator-scripts\") pod \"nova-cell1-937f-account-create-update-k224b\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.252391 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ww55\" (UniqueName: \"kubernetes.io/projected/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-kube-api-access-9ww55\") pod \"nova-cell1-937f-account-create-update-k224b\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.354231 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ww55\" (UniqueName: \"kubernetes.io/projected/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-kube-api-access-9ww55\") pod \"nova-cell1-937f-account-create-update-k224b\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.354374 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-operator-scripts\") pod \"nova-cell1-937f-account-create-update-k224b\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.355264 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-operator-scripts\") pod \"nova-cell1-937f-account-create-update-k224b\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.359288 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.376149 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ww55\" (UniqueName: \"kubernetes.io/projected/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-kube-api-access-9ww55\") pod \"nova-cell1-937f-account-create-update-k224b\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.554920 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.622835 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qdtmh"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.743547 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-drt66"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.769110 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-xpp64"] Dec 04 01:14:18 crc kubenswrapper[4764]: W1204 01:14:18.784925 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33377c8a_9ea1_440b_aac0_8492e36ae5d3.slice/crio-2925bf8f386252d0a51ae8593d6ca291fa485ec30ec647d812892851267e09ce WatchSource:0}: Error finding container 2925bf8f386252d0a51ae8593d6ca291fa485ec30ec647d812892851267e09ce: Status 404 returned error can't find the container with id 2925bf8f386252d0a51ae8593d6ca291fa485ec30ec647d812892851267e09ce Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.790550 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cccc-account-create-update-2rbt2"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.799681 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-937f-account-create-update-k224b"] Dec 04 01:14:18 crc kubenswrapper[4764]: W1204 01:14:18.814954 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c7be3d_f739_4ef4_9a96_e5ab3c3fb3b4.slice/crio-16fb560363ab9d96640bc8baf0f5c5957ed745fda7b133fdde4d4508674ffe8b WatchSource:0}: Error finding container 16fb560363ab9d96640bc8baf0f5c5957ed745fda7b133fdde4d4508674ffe8b: Status 404 returned error can't find the container with id 
16fb560363ab9d96640bc8baf0f5c5957ed745fda7b133fdde4d4508674ffe8b Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.881075 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7cb3-account-create-update-k4bwd"] Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.883448 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xpp64" event={"ID":"386aa79c-4a08-4154-896d-f7be1f444951","Type":"ContainerStarted","Data":"383616d576ad9f2818f8801fdc69c990445e9d2ec67b2373b190ce4ca5ec8d9d"} Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.885210 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cccc-account-create-update-2rbt2" event={"ID":"33377c8a-9ea1-440b-aac0-8492e36ae5d3","Type":"ContainerStarted","Data":"2925bf8f386252d0a51ae8593d6ca291fa485ec30ec647d812892851267e09ce"} Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.886852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-drt66" event={"ID":"a7003749-8986-4904-acbf-32390dab7600","Type":"ContainerStarted","Data":"fac3efa7af43e8b49ba050effbef14a48bc16297b14a8ae8bc1f222c69152c18"} Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.888463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qdtmh" event={"ID":"07e40f27-93e1-44ed-8887-e7ba539f4270","Type":"ContainerStarted","Data":"bbe039d126df8b31efc3b1b4e3b157753e80b5024142710df9c3f1459445e511"} Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.888493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qdtmh" event={"ID":"07e40f27-93e1-44ed-8887-e7ba539f4270","Type":"ContainerStarted","Data":"554d224dd22f80fd2df97fb0cda2673c3bfa4661cc0fb24888271606887ed731"} Dec 04 01:14:18 crc kubenswrapper[4764]: W1204 01:14:18.895338 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod015a0994_0aee_4ed7_a588_ed74a8cb5db3.slice/crio-626a4e845ad33f6af483c4b9a71582b63a029df4a05a49a08825767302b5533d WatchSource:0}: Error finding container 626a4e845ad33f6af483c4b9a71582b63a029df4a05a49a08825767302b5533d: Status 404 returned error can't find the container with id 626a4e845ad33f6af483c4b9a71582b63a029df4a05a49a08825767302b5533d Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.897075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-937f-account-create-update-k224b" event={"ID":"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4","Type":"ContainerStarted","Data":"16fb560363ab9d96640bc8baf0f5c5957ed745fda7b133fdde4d4508674ffe8b"} Dec 04 01:14:18 crc kubenswrapper[4764]: I1204 01:14:18.914093 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-qdtmh" podStartSLOduration=1.914075656 podStartE2EDuration="1.914075656s" podCreationTimestamp="2025-12-04 01:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:18.910126119 +0000 UTC m=+5594.671450530" watchObservedRunningTime="2025-12-04 01:14:18.914075656 +0000 UTC m=+5594.675400067" Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.907742 4764 generic.go:334] "Generic (PLEG): container finished" podID="07e40f27-93e1-44ed-8887-e7ba539f4270" containerID="bbe039d126df8b31efc3b1b4e3b157753e80b5024142710df9c3f1459445e511" exitCode=0 Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.907874 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qdtmh" event={"ID":"07e40f27-93e1-44ed-8887-e7ba539f4270","Type":"ContainerDied","Data":"bbe039d126df8b31efc3b1b4e3b157753e80b5024142710df9c3f1459445e511"} Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.909617 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" containerID="705b2350661e0762320380e7d505a3762070a7cf9a53e7ec08c3c3910a996d5a" exitCode=0 Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.909661 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-937f-account-create-update-k224b" event={"ID":"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4","Type":"ContainerDied","Data":"705b2350661e0762320380e7d505a3762070a7cf9a53e7ec08c3c3910a996d5a"} Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.911142 4764 generic.go:334] "Generic (PLEG): container finished" podID="015a0994-0aee-4ed7-a588-ed74a8cb5db3" containerID="8fafaaf760b2c10aa3408f55dde7cdee94f47ee862e0e7bb54ee73040a297220" exitCode=0 Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.911269 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" event={"ID":"015a0994-0aee-4ed7-a588-ed74a8cb5db3","Type":"ContainerDied","Data":"8fafaaf760b2c10aa3408f55dde7cdee94f47ee862e0e7bb54ee73040a297220"} Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.911295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" event={"ID":"015a0994-0aee-4ed7-a588-ed74a8cb5db3","Type":"ContainerStarted","Data":"626a4e845ad33f6af483c4b9a71582b63a029df4a05a49a08825767302b5533d"} Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.913913 4764 generic.go:334] "Generic (PLEG): container finished" podID="386aa79c-4a08-4154-896d-f7be1f444951" containerID="2197dd50300f9109e0937455d333d6ed1bd8d75784bda88be80d7a553356c765" exitCode=0 Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.913959 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xpp64" event={"ID":"386aa79c-4a08-4154-896d-f7be1f444951","Type":"ContainerDied","Data":"2197dd50300f9109e0937455d333d6ed1bd8d75784bda88be80d7a553356c765"} Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 
01:14:19.915629 4764 generic.go:334] "Generic (PLEG): container finished" podID="33377c8a-9ea1-440b-aac0-8492e36ae5d3" containerID="7474feb2c708dcc2dacb2db55823dbab40d98e321efee67abd49022edbcce363" exitCode=0 Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.915678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cccc-account-create-update-2rbt2" event={"ID":"33377c8a-9ea1-440b-aac0-8492e36ae5d3","Type":"ContainerDied","Data":"7474feb2c708dcc2dacb2db55823dbab40d98e321efee67abd49022edbcce363"} Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.935026 4764 generic.go:334] "Generic (PLEG): container finished" podID="a7003749-8986-4904-acbf-32390dab7600" containerID="5347af05e7926ebac84bc843569b67c8ace22df2bd1085ed302ee5efd82ee24b" exitCode=0 Dec 04 01:14:19 crc kubenswrapper[4764]: I1204 01:14:19.935091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-drt66" event={"ID":"a7003749-8986-4904-acbf-32390dab7600","Type":"ContainerDied","Data":"5347af05e7926ebac84bc843569b67c8ace22df2bd1085ed302ee5efd82ee24b"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.341323 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.498013 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.504085 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.510137 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.518346 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bvdf\" (UniqueName: \"kubernetes.io/projected/a7003749-8986-4904-acbf-32390dab7600-kube-api-access-6bvdf\") pod \"a7003749-8986-4904-acbf-32390dab7600\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.518465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7003749-8986-4904-acbf-32390dab7600-operator-scripts\") pod \"a7003749-8986-4904-acbf-32390dab7600\" (UID: \"a7003749-8986-4904-acbf-32390dab7600\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.519201 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7003749-8986-4904-acbf-32390dab7600-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7003749-8986-4904-acbf-32390dab7600" (UID: "a7003749-8986-4904-acbf-32390dab7600"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.560458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7003749-8986-4904-acbf-32390dab7600-kube-api-access-6bvdf" (OuterVolumeSpecName: "kube-api-access-6bvdf") pod "a7003749-8986-4904-acbf-32390dab7600" (UID: "a7003749-8986-4904-acbf-32390dab7600"). InnerVolumeSpecName "kube-api-access-6bvdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620423 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33377c8a-9ea1-440b-aac0-8492e36ae5d3-operator-scripts\") pod \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620476 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015a0994-0aee-4ed7-a588-ed74a8cb5db3-operator-scripts\") pod \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620552 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386aa79c-4a08-4154-896d-f7be1f444951-operator-scripts\") pod \"386aa79c-4a08-4154-896d-f7be1f444951\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4999\" (UniqueName: \"kubernetes.io/projected/33377c8a-9ea1-440b-aac0-8492e36ae5d3-kube-api-access-x4999\") pod \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\" (UID: \"33377c8a-9ea1-440b-aac0-8492e36ae5d3\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620754 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf7rj\" (UniqueName: \"kubernetes.io/projected/015a0994-0aee-4ed7-a588-ed74a8cb5db3-kube-api-access-nf7rj\") pod \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\" (UID: \"015a0994-0aee-4ed7-a588-ed74a8cb5db3\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620777 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-6qbbt\" (UniqueName: \"kubernetes.io/projected/386aa79c-4a08-4154-896d-f7be1f444951-kube-api-access-6qbbt\") pod \"386aa79c-4a08-4154-896d-f7be1f444951\" (UID: \"386aa79c-4a08-4154-896d-f7be1f444951\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.620986 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33377c8a-9ea1-440b-aac0-8492e36ae5d3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33377c8a-9ea1-440b-aac0-8492e36ae5d3" (UID: "33377c8a-9ea1-440b-aac0-8492e36ae5d3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.621457 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015a0994-0aee-4ed7-a588-ed74a8cb5db3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "015a0994-0aee-4ed7-a588-ed74a8cb5db3" (UID: "015a0994-0aee-4ed7-a588-ed74a8cb5db3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.621452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/386aa79c-4a08-4154-896d-f7be1f444951-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "386aa79c-4a08-4154-896d-f7be1f444951" (UID: "386aa79c-4a08-4154-896d-f7be1f444951"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.621536 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7003749-8986-4904-acbf-32390dab7600-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.621568 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bvdf\" (UniqueName: \"kubernetes.io/projected/a7003749-8986-4904-acbf-32390dab7600-kube-api-access-6bvdf\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.621582 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33377c8a-9ea1-440b-aac0-8492e36ae5d3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.623877 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015a0994-0aee-4ed7-a588-ed74a8cb5db3-kube-api-access-nf7rj" (OuterVolumeSpecName: "kube-api-access-nf7rj") pod "015a0994-0aee-4ed7-a588-ed74a8cb5db3" (UID: "015a0994-0aee-4ed7-a588-ed74a8cb5db3"). InnerVolumeSpecName "kube-api-access-nf7rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.624572 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33377c8a-9ea1-440b-aac0-8492e36ae5d3-kube-api-access-x4999" (OuterVolumeSpecName: "kube-api-access-x4999") pod "33377c8a-9ea1-440b-aac0-8492e36ae5d3" (UID: "33377c8a-9ea1-440b-aac0-8492e36ae5d3"). InnerVolumeSpecName "kube-api-access-x4999". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.624974 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386aa79c-4a08-4154-896d-f7be1f444951-kube-api-access-6qbbt" (OuterVolumeSpecName: "kube-api-access-6qbbt") pod "386aa79c-4a08-4154-896d-f7be1f444951" (UID: "386aa79c-4a08-4154-896d-f7be1f444951"). InnerVolumeSpecName "kube-api-access-6qbbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.625037 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.636839 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.722687 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ww55\" (UniqueName: \"kubernetes.io/projected/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-kube-api-access-9ww55\") pod \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.722970 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-operator-scripts\") pod \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\" (UID: \"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.723357 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" (UID: "67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.724955 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/386aa79c-4a08-4154-896d-f7be1f444951-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.724989 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.725001 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4999\" (UniqueName: \"kubernetes.io/projected/33377c8a-9ea1-440b-aac0-8492e36ae5d3-kube-api-access-x4999\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.725012 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf7rj\" (UniqueName: \"kubernetes.io/projected/015a0994-0aee-4ed7-a588-ed74a8cb5db3-kube-api-access-nf7rj\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.725020 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qbbt\" (UniqueName: \"kubernetes.io/projected/386aa79c-4a08-4154-896d-f7be1f444951-kube-api-access-6qbbt\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.725029 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015a0994-0aee-4ed7-a588-ed74a8cb5db3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.726635 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-kube-api-access-9ww55" 
(OuterVolumeSpecName: "kube-api-access-9ww55") pod "67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" (UID: "67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4"). InnerVolumeSpecName "kube-api-access-9ww55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.825731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfnrf\" (UniqueName: \"kubernetes.io/projected/07e40f27-93e1-44ed-8887-e7ba539f4270-kube-api-access-gfnrf\") pod \"07e40f27-93e1-44ed-8887-e7ba539f4270\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.825803 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07e40f27-93e1-44ed-8887-e7ba539f4270-operator-scripts\") pod \"07e40f27-93e1-44ed-8887-e7ba539f4270\" (UID: \"07e40f27-93e1-44ed-8887-e7ba539f4270\") " Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.826547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07e40f27-93e1-44ed-8887-e7ba539f4270-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "07e40f27-93e1-44ed-8887-e7ba539f4270" (UID: "07e40f27-93e1-44ed-8887-e7ba539f4270"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.826653 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ww55\" (UniqueName: \"kubernetes.io/projected/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4-kube-api-access-9ww55\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.829862 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e40f27-93e1-44ed-8887-e7ba539f4270-kube-api-access-gfnrf" (OuterVolumeSpecName: "kube-api-access-gfnrf") pod "07e40f27-93e1-44ed-8887-e7ba539f4270" (UID: "07e40f27-93e1-44ed-8887-e7ba539f4270"). InnerVolumeSpecName "kube-api-access-gfnrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.928382 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfnrf\" (UniqueName: \"kubernetes.io/projected/07e40f27-93e1-44ed-8887-e7ba539f4270-kube-api-access-gfnrf\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.928414 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/07e40f27-93e1-44ed-8887-e7ba539f4270-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.952356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qdtmh" event={"ID":"07e40f27-93e1-44ed-8887-e7ba539f4270","Type":"ContainerDied","Data":"554d224dd22f80fd2df97fb0cda2673c3bfa4661cc0fb24888271606887ed731"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.952386 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qdtmh" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.952407 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="554d224dd22f80fd2df97fb0cda2673c3bfa4661cc0fb24888271606887ed731" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.954091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-937f-account-create-update-k224b" event={"ID":"67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4","Type":"ContainerDied","Data":"16fb560363ab9d96640bc8baf0f5c5957ed745fda7b133fdde4d4508674ffe8b"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.954114 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16fb560363ab9d96640bc8baf0f5c5957ed745fda7b133fdde4d4508674ffe8b" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.954172 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-937f-account-create-update-k224b" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.965961 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" event={"ID":"015a0994-0aee-4ed7-a588-ed74a8cb5db3","Type":"ContainerDied","Data":"626a4e845ad33f6af483c4b9a71582b63a029df4a05a49a08825767302b5533d"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.965990 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="626a4e845ad33f6af483c4b9a71582b63a029df4a05a49a08825767302b5533d" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.966030 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7cb3-account-create-update-k4bwd" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.967898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-xpp64" event={"ID":"386aa79c-4a08-4154-896d-f7be1f444951","Type":"ContainerDied","Data":"383616d576ad9f2818f8801fdc69c990445e9d2ec67b2373b190ce4ca5ec8d9d"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.967919 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="383616d576ad9f2818f8801fdc69c990445e9d2ec67b2373b190ce4ca5ec8d9d" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.967961 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-xpp64" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.969774 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cccc-account-create-update-2rbt2" event={"ID":"33377c8a-9ea1-440b-aac0-8492e36ae5d3","Type":"ContainerDied","Data":"2925bf8f386252d0a51ae8593d6ca291fa485ec30ec647d812892851267e09ce"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.969795 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2925bf8f386252d0a51ae8593d6ca291fa485ec30ec647d812892851267e09ce" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.969839 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cccc-account-create-update-2rbt2" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.977499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-drt66" event={"ID":"a7003749-8986-4904-acbf-32390dab7600","Type":"ContainerDied","Data":"fac3efa7af43e8b49ba050effbef14a48bc16297b14a8ae8bc1f222c69152c18"} Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.977811 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac3efa7af43e8b49ba050effbef14a48bc16297b14a8ae8bc1f222c69152c18" Dec 04 01:14:21 crc kubenswrapper[4764]: I1204 01:14:21.977539 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-drt66" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082094 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n9cc8"] Dec 04 01:14:23 crc kubenswrapper[4764]: E1204 01:14:23.082820 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33377c8a-9ea1-440b-aac0-8492e36ae5d3" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082836 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="33377c8a-9ea1-440b-aac0-8492e36ae5d3" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: E1204 01:14:23.082859 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7003749-8986-4904-acbf-32390dab7600" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082866 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7003749-8986-4904-acbf-32390dab7600" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: E1204 01:14:23.082889 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" 
containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082898 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: E1204 01:14:23.082911 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386aa79c-4a08-4154-896d-f7be1f444951" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082918 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="386aa79c-4a08-4154-896d-f7be1f444951" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: E1204 01:14:23.082931 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07e40f27-93e1-44ed-8887-e7ba539f4270" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082938 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e40f27-93e1-44ed-8887-e7ba539f4270" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: E1204 01:14:23.082954 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015a0994-0aee-4ed7-a588-ed74a8cb5db3" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.082961 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="015a0994-0aee-4ed7-a588-ed74a8cb5db3" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083157 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7003749-8986-4904-acbf-32390dab7600" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083176 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="386aa79c-4a08-4154-896d-f7be1f444951" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083188 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="33377c8a-9ea1-440b-aac0-8492e36ae5d3" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083201 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e40f27-93e1-44ed-8887-e7ba539f4270" containerName="mariadb-database-create" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083213 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083228 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="015a0994-0aee-4ed7-a588-ed74a8cb5db3" containerName="mariadb-account-create-update" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.083978 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.087020 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.087195 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.087366 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q8g8v" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.095089 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n9cc8"] Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.249861 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-scripts\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: 
\"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.249938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.249974 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chlk8\" (UniqueName: \"kubernetes.io/projected/b6db3b0c-354c-430a-aa30-9c1a14a3c540-kube-api-access-chlk8\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.250137 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-config-data\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.352442 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.352523 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chlk8\" (UniqueName: 
\"kubernetes.io/projected/b6db3b0c-354c-430a-aa30-9c1a14a3c540-kube-api-access-chlk8\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.352574 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-config-data\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.352665 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-scripts\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.356884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-scripts\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.357571 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-config-data\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.365462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.368670 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chlk8\" (UniqueName: \"kubernetes.io/projected/b6db3b0c-354c-430a-aa30-9c1a14a3c540-kube-api-access-chlk8\") pod \"nova-cell0-conductor-db-sync-n9cc8\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.403517 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:23 crc kubenswrapper[4764]: I1204 01:14:23.922710 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n9cc8"] Dec 04 01:14:24 crc kubenswrapper[4764]: I1204 01:14:24.000606 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" event={"ID":"b6db3b0c-354c-430a-aa30-9c1a14a3c540","Type":"ContainerStarted","Data":"8806afcd54f9c1cb1e1e076e63c24ffdd02166e381d311b35c4c77c63154c177"} Dec 04 01:14:25 crc kubenswrapper[4764]: I1204 01:14:25.108252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" event={"ID":"b6db3b0c-354c-430a-aa30-9c1a14a3c540","Type":"ContainerStarted","Data":"d1641ec5aa57f03d35c514443cdbca3c3414a019b023d7d63f9cf5f82c174245"} Dec 04 01:14:25 crc kubenswrapper[4764]: I1204 01:14:25.153760 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" podStartSLOduration=2.153741765 podStartE2EDuration="2.153741765s" podCreationTimestamp="2025-12-04 01:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:25.136195413 +0000 UTC m=+5600.897519824" watchObservedRunningTime="2025-12-04 01:14:25.153741765 +0000 UTC m=+5600.915066176" Dec 04 01:14:29 crc kubenswrapper[4764]: I1204 01:14:29.546104 4764 scope.go:117] "RemoveContainer" containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:14:30 crc kubenswrapper[4764]: I1204 01:14:30.168421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"0afac051ad12f65e3908eb7c494c1d5ea59de94feec2898e7b80d8b1b5968274"} Dec 04 01:14:30 crc kubenswrapper[4764]: I1204 01:14:30.170618 4764 generic.go:334] "Generic (PLEG): container finished" podID="b6db3b0c-354c-430a-aa30-9c1a14a3c540" containerID="d1641ec5aa57f03d35c514443cdbca3c3414a019b023d7d63f9cf5f82c174245" exitCode=0 Dec 04 01:14:30 crc kubenswrapper[4764]: I1204 01:14:30.170661 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" event={"ID":"b6db3b0c-354c-430a-aa30-9c1a14a3c540","Type":"ContainerDied","Data":"d1641ec5aa57f03d35c514443cdbca3c3414a019b023d7d63f9cf5f82c174245"} Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.599222 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.733023 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-config-data\") pod \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.733084 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chlk8\" (UniqueName: \"kubernetes.io/projected/b6db3b0c-354c-430a-aa30-9c1a14a3c540-kube-api-access-chlk8\") pod \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.733127 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-combined-ca-bundle\") pod \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.733205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-scripts\") pod \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\" (UID: \"b6db3b0c-354c-430a-aa30-9c1a14a3c540\") " Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.738684 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-scripts" (OuterVolumeSpecName: "scripts") pod "b6db3b0c-354c-430a-aa30-9c1a14a3c540" (UID: "b6db3b0c-354c-430a-aa30-9c1a14a3c540"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.745614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6db3b0c-354c-430a-aa30-9c1a14a3c540-kube-api-access-chlk8" (OuterVolumeSpecName: "kube-api-access-chlk8") pod "b6db3b0c-354c-430a-aa30-9c1a14a3c540" (UID: "b6db3b0c-354c-430a-aa30-9c1a14a3c540"). InnerVolumeSpecName "kube-api-access-chlk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.760433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6db3b0c-354c-430a-aa30-9c1a14a3c540" (UID: "b6db3b0c-354c-430a-aa30-9c1a14a3c540"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.760534 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-config-data" (OuterVolumeSpecName: "config-data") pod "b6db3b0c-354c-430a-aa30-9c1a14a3c540" (UID: "b6db3b0c-354c-430a-aa30-9c1a14a3c540"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.835026 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chlk8\" (UniqueName: \"kubernetes.io/projected/b6db3b0c-354c-430a-aa30-9c1a14a3c540-kube-api-access-chlk8\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.835065 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.835078 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:31 crc kubenswrapper[4764]: I1204 01:14:31.835089 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6db3b0c-354c-430a-aa30-9c1a14a3c540-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.191208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" event={"ID":"b6db3b0c-354c-430a-aa30-9c1a14a3c540","Type":"ContainerDied","Data":"8806afcd54f9c1cb1e1e076e63c24ffdd02166e381d311b35c4c77c63154c177"} Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.191497 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8806afcd54f9c1cb1e1e076e63c24ffdd02166e381d311b35c4c77c63154c177" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.191271 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-n9cc8" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.297901 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:14:32 crc kubenswrapper[4764]: E1204 01:14:32.298236 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6db3b0c-354c-430a-aa30-9c1a14a3c540" containerName="nova-cell0-conductor-db-sync" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.298252 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6db3b0c-354c-430a-aa30-9c1a14a3c540" containerName="nova-cell0-conductor-db-sync" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.298429 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6db3b0c-354c-430a-aa30-9c1a14a3c540" containerName="nova-cell0-conductor-db-sync" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.298987 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.301458 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q8g8v" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.311369 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.314869 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.343784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 
01:14:32.343849 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.344159 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sm57\" (UniqueName: \"kubernetes.io/projected/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-kube-api-access-2sm57\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.446359 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sm57\" (UniqueName: \"kubernetes.io/projected/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-kube-api-access-2sm57\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.446424 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.446462 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.451068 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.451571 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.466683 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sm57\" (UniqueName: \"kubernetes.io/projected/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-kube-api-access-2sm57\") pod \"nova-cell0-conductor-0\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:32 crc kubenswrapper[4764]: I1204 01:14:32.625904 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:33 crc kubenswrapper[4764]: I1204 01:14:33.113675 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:14:33 crc kubenswrapper[4764]: I1204 01:14:33.212145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73bd46f9-c2db-40fb-a8bd-bb922f14fef4","Type":"ContainerStarted","Data":"8b15e507be059cc55182b5ecccdf66e79f0c1aebecae3d40b9c1cbf4667a5586"} Dec 04 01:14:34 crc kubenswrapper[4764]: I1204 01:14:34.230825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73bd46f9-c2db-40fb-a8bd-bb922f14fef4","Type":"ContainerStarted","Data":"3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f"} Dec 04 01:14:34 crc kubenswrapper[4764]: I1204 01:14:34.231906 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:34 crc kubenswrapper[4764]: I1204 01:14:34.269392 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.269332007 podStartE2EDuration="2.269332007s" podCreationTimestamp="2025-12-04 01:14:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:34.259123895 +0000 UTC m=+5610.020448336" watchObservedRunningTime="2025-12-04 01:14:34.269332007 +0000 UTC m=+5610.030656448" Dec 04 01:14:42 crc kubenswrapper[4764]: I1204 01:14:42.655570 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.232636 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-zldpr"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.233888 4764 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.237286 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.241700 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.249784 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-zldpr"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.352625 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-config-data\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.352674 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wjt\" (UniqueName: \"kubernetes.io/projected/adf4de2e-ddef-407f-bc65-e40c822a8a93-kube-api-access-g8wjt\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.352702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.352790 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-scripts\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.355788 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.356857 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.360056 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.376868 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.443962 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.445111 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.448905 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.454646 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvhfw\" (UniqueName: \"kubernetes.io/projected/e16b3743-1815-4473-84f5-0cb21a1bebee-kube-api-access-vvhfw\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.454761 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-config-data\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.454787 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wjt\" (UniqueName: \"kubernetes.io/projected/adf4de2e-ddef-407f-bc65-e40c822a8a93-kube-api-access-g8wjt\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.454881 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.454986 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.455025 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.455133 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-scripts\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.459028 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.467914 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-config-data\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.482792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-scripts\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.499778 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.542851 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.547969 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.557839 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wjt\" (UniqueName: \"kubernetes.io/projected/adf4de2e-ddef-407f-bc65-e40c822a8a93-kube-api-access-g8wjt\") pod \"nova-cell0-cell-mapping-zldpr\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.558492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vtg\" (UniqueName: \"kubernetes.io/projected/4852d738-7bda-453f-b872-373712269eae-kube-api-access-m6vtg\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.558572 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.558607 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.558636 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.558675 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-config-data\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.561348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvhfw\" (UniqueName: \"kubernetes.io/projected/e16b3743-1815-4473-84f5-0cb21a1bebee-kube-api-access-vvhfw\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.562010 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.564360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.589413 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.606566 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.606903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvhfw\" (UniqueName: \"kubernetes.io/projected/e16b3743-1815-4473-84f5-0cb21a1bebee-kube-api-access-vvhfw\") pod \"nova-cell1-novncproxy-0\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.617726 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.619258 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.622636 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.630435 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.666831 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vtg\" (UniqueName: \"kubernetes.io/projected/4852d738-7bda-453f-b872-373712269eae-kube-api-access-m6vtg\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.666904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.666927 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-config-data\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.666972 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.666997 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-config-data\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.667015 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2vg\" (UniqueName: \"kubernetes.io/projected/a55f8d14-d6c2-4de9-8a3b-821308938861-kube-api-access-9h2vg\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.667057 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55f8d14-d6c2-4de9-8a3b-821308938861-logs\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.667600 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f499f49d9-4dq6j"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.671060 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.676351 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.683139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-config-data\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.683251 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vtg\" (UniqueName: \"kubernetes.io/projected/4852d738-7bda-453f-b872-373712269eae-kube-api-access-m6vtg\") pod \"nova-scheduler-0\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.686110 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f499f49d9-4dq6j"] Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.690183 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768558 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2vg\" (UniqueName: \"kubernetes.io/projected/a55f8d14-d6c2-4de9-8a3b-821308938861-kube-api-access-9h2vg\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768585 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-nb\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768611 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55f8d14-d6c2-4de9-8a3b-821308938861-logs\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768661 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-config\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 
01:14:43.768679 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jqld\" (UniqueName: \"kubernetes.io/projected/e1f2ea3d-8c91-45e3-92f8-87308627efeb-kube-api-access-9jqld\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-sb\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768741 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crjkz\" (UniqueName: \"kubernetes.io/projected/ff1be383-0850-4d50-99ff-11e399ef6799-kube-api-access-crjkz\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-dns-svc\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768819 
4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-config-data\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-config-data\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.768868 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff1be383-0850-4d50-99ff-11e399ef6799-logs\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.770112 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55f8d14-d6c2-4de9-8a3b-821308938861-logs\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.773654 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.773743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-config-data\") pod \"nova-metadata-0\" (UID: 
\"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.784254 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2vg\" (UniqueName: \"kubernetes.io/projected/a55f8d14-d6c2-4de9-8a3b-821308938861-kube-api-access-9h2vg\") pod \"nova-metadata-0\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.845296 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.856590 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff1be383-0850-4d50-99ff-11e399ef6799-logs\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-nb\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-config\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870872 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9jqld\" (UniqueName: \"kubernetes.io/projected/e1f2ea3d-8c91-45e3-92f8-87308627efeb-kube-api-access-9jqld\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870903 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-sb\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870920 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crjkz\" (UniqueName: \"kubernetes.io/projected/ff1be383-0850-4d50-99ff-11e399ef6799-kube-api-access-crjkz\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870980 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-dns-svc\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.870997 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-config-data\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.872350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff1be383-0850-4d50-99ff-11e399ef6799-logs\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.873031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-nb\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.873515 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-config\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.875543 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-dns-svc\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.876051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-sb\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 
crc kubenswrapper[4764]: I1204 01:14:43.877976 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.884332 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-config-data\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.888677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jqld\" (UniqueName: \"kubernetes.io/projected/e1f2ea3d-8c91-45e3-92f8-87308627efeb-kube-api-access-9jqld\") pod \"dnsmasq-dns-f499f49d9-4dq6j\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") " pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.893259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crjkz\" (UniqueName: \"kubernetes.io/projected/ff1be383-0850-4d50-99ff-11e399ef6799-kube-api-access-crjkz\") pod \"nova-api-0\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " pod="openstack/nova-api-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.957751 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:14:43 crc kubenswrapper[4764]: I1204 01:14:43.969742 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.034331 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.182129 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.196139 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6cpft"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.197324 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.199445 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.199580 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.217497 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6cpft"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.279685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-scripts\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.280106 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-config-data\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.280152 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.280227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmcsj\" (UniqueName: \"kubernetes.io/projected/5fdc2149-703a-4038-9b06-75f9c5ef5f21-kube-api-access-jmcsj\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.323962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e16b3743-1815-4473-84f5-0cb21a1bebee","Type":"ContainerStarted","Data":"2cc2f7efbe318a18fd099738bd2602f333d221a4ce5bc4b41ba833cc08a82b31"} Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.347598 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-zldpr"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.381441 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmcsj\" (UniqueName: \"kubernetes.io/projected/5fdc2149-703a-4038-9b06-75f9c5ef5f21-kube-api-access-jmcsj\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.381509 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-scripts\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: 
\"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.381549 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-config-data\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.381589 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.384834 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-config-data\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.385512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-scripts\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.386738 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " 
pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.402278 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmcsj\" (UniqueName: \"kubernetes.io/projected/5fdc2149-703a-4038-9b06-75f9c5ef5f21-kube-api-access-jmcsj\") pod \"nova-cell1-conductor-db-sync-6cpft\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: W1204 01:14:44.457600 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4852d738_7bda_453f_b872_373712269eae.slice/crio-b4b28b6cf159a2f718c4edcd9723e92ac9a8d684b9af687c3a05b4f07f38d8f1 WatchSource:0}: Error finding container b4b28b6cf159a2f718c4edcd9723e92ac9a8d684b9af687c3a05b4f07f38d8f1: Status 404 returned error can't find the container with id b4b28b6cf159a2f718c4edcd9723e92ac9a8d684b9af687c3a05b4f07f38d8f1 Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.464773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.514244 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:44 crc kubenswrapper[4764]: W1204 01:14:44.583480 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff1be383_0850_4d50_99ff_11e399ef6799.slice/crio-538235bfa70a736b2e9b4cb1557483592c3ca2ccb606ce1e2277195c219ff304 WatchSource:0}: Error finding container 538235bfa70a736b2e9b4cb1557483592c3ca2ccb606ce1e2277195c219ff304: Status 404 returned error can't find the container with id 538235bfa70a736b2e9b4cb1557483592c3ca2ccb606ce1e2277195c219ff304 Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.583659 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.586363 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f499f49d9-4dq6j"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.602602 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:44 crc kubenswrapper[4764]: I1204 01:14:44.995521 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6cpft"] Dec 04 01:14:44 crc kubenswrapper[4764]: W1204 01:14:44.997997 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fdc2149_703a_4038_9b06_75f9c5ef5f21.slice/crio-d1007d7ee19933b47af031eaf9ef877ca8d82d566ef8547cde9631d23cc6d2c5 WatchSource:0}: Error finding container d1007d7ee19933b47af031eaf9ef877ca8d82d566ef8547cde9631d23cc6d2c5: Status 404 returned error can't find the container with id d1007d7ee19933b47af031eaf9ef877ca8d82d566ef8547cde9631d23cc6d2c5 Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.336706 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6cpft" 
event={"ID":"5fdc2149-703a-4038-9b06-75f9c5ef5f21","Type":"ContainerStarted","Data":"60160ebe62b3473a9a59940969d4c0ca7d19afad430ba5cb9613978c206d5402"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.336775 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6cpft" event={"ID":"5fdc2149-703a-4038-9b06-75f9c5ef5f21","Type":"ContainerStarted","Data":"d1007d7ee19933b47af031eaf9ef877ca8d82d566ef8547cde9631d23cc6d2c5"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.364260 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e16b3743-1815-4473-84f5-0cb21a1bebee","Type":"ContainerStarted","Data":"ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.366398 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff1be383-0850-4d50-99ff-11e399ef6799","Type":"ContainerStarted","Data":"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.366536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff1be383-0850-4d50-99ff-11e399ef6799","Type":"ContainerStarted","Data":"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.366616 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff1be383-0850-4d50-99ff-11e399ef6799","Type":"ContainerStarted","Data":"538235bfa70a736b2e9b4cb1557483592c3ca2ccb606ce1e2277195c219ff304"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.374781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4852d738-7bda-453f-b872-373712269eae","Type":"ContainerStarted","Data":"ed8c99b2c406adc69b78bfa7bebd1f680e93995a6c2adaf0ee3521de287762b1"} Dec 04 01:14:45 crc 
kubenswrapper[4764]: I1204 01:14:45.374829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4852d738-7bda-453f-b872-373712269eae","Type":"ContainerStarted","Data":"b4b28b6cf159a2f718c4edcd9723e92ac9a8d684b9af687c3a05b4f07f38d8f1"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.385439 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a55f8d14-d6c2-4de9-8a3b-821308938861","Type":"ContainerStarted","Data":"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.385495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a55f8d14-d6c2-4de9-8a3b-821308938861","Type":"ContainerStarted","Data":"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.385506 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a55f8d14-d6c2-4de9-8a3b-821308938861","Type":"ContainerStarted","Data":"d3c5d20a96389a39b721bbccfa21b03003a31c45867287dae0b551d6331250c5"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.387804 4764 generic.go:334] "Generic (PLEG): container finished" podID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerID="d78fcf491b719445bce8b67ea2ccd011109a985b5ce6b037d8c75075bfd0aff6" exitCode=0 Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.390075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" event={"ID":"e1f2ea3d-8c91-45e3-92f8-87308627efeb","Type":"ContainerDied","Data":"d78fcf491b719445bce8b67ea2ccd011109a985b5ce6b037d8c75075bfd0aff6"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.390109 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" 
event={"ID":"e1f2ea3d-8c91-45e3-92f8-87308627efeb","Type":"ContainerStarted","Data":"93835aa8d35e3186ad31513169a748e50a7d91c0bdc366c5fce1278d185945d4"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.396708 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zldpr" event={"ID":"adf4de2e-ddef-407f-bc65-e40c822a8a93","Type":"ContainerStarted","Data":"6efddbb211d9d38a885d9ae01978c9959a810d165236491985b62a83dffd366d"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.396756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zldpr" event={"ID":"adf4de2e-ddef-407f-bc65-e40c822a8a93","Type":"ContainerStarted","Data":"feb2d6a8567379193927f0d9e9cb1fcea3075d8bb67a53a3b501cb621fead022"} Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.414354 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-6cpft" podStartSLOduration=1.414329866 podStartE2EDuration="1.414329866s" podCreationTimestamp="2025-12-04 01:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:45.381256782 +0000 UTC m=+5621.142581193" watchObservedRunningTime="2025-12-04 01:14:45.414329866 +0000 UTC m=+5621.175654287" Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.459791 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.459695323 podStartE2EDuration="2.459695323s" podCreationTimestamp="2025-12-04 01:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:45.421568404 +0000 UTC m=+5621.182892815" watchObservedRunningTime="2025-12-04 01:14:45.459695323 +0000 UTC m=+5621.221019734" Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.503172 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.5031533230000003 podStartE2EDuration="2.503153323s" podCreationTimestamp="2025-12-04 01:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:45.482453883 +0000 UTC m=+5621.243778314" watchObservedRunningTime="2025-12-04 01:14:45.503153323 +0000 UTC m=+5621.264477734" Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.527998 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.527981044 podStartE2EDuration="2.527981044s" podCreationTimestamp="2025-12-04 01:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:45.51807054 +0000 UTC m=+5621.279394971" watchObservedRunningTime="2025-12-04 01:14:45.527981044 +0000 UTC m=+5621.289305455" Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.595441 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.595395383 podStartE2EDuration="2.595395383s" podCreationTimestamp="2025-12-04 01:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:45.558406633 +0000 UTC m=+5621.319731044" watchObservedRunningTime="2025-12-04 01:14:45.595395383 +0000 UTC m=+5621.356719794" Dec 04 01:14:45 crc kubenswrapper[4764]: I1204 01:14:45.647760 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-zldpr" podStartSLOduration=2.647739192 podStartE2EDuration="2.647739192s" podCreationTimestamp="2025-12-04 01:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:45.577322438 +0000 UTC m=+5621.338646849" watchObservedRunningTime="2025-12-04 01:14:45.647739192 +0000 UTC m=+5621.409063603" Dec 04 01:14:46 crc kubenswrapper[4764]: I1204 01:14:46.412578 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" event={"ID":"e1f2ea3d-8c91-45e3-92f8-87308627efeb","Type":"ContainerStarted","Data":"edd93145d1bedd3c82c160d5efebed40d403e24ca31b5af3b034c6c490a32b42"} Dec 04 01:14:46 crc kubenswrapper[4764]: I1204 01:14:46.446318 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" podStartSLOduration=3.44630002 podStartE2EDuration="3.44630002s" podCreationTimestamp="2025-12-04 01:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:46.436692683 +0000 UTC m=+5622.198017114" watchObservedRunningTime="2025-12-04 01:14:46.44630002 +0000 UTC m=+5622.207624431" Dec 04 01:14:47 crc kubenswrapper[4764]: I1204 01:14:47.421781 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:48 crc kubenswrapper[4764]: I1204 01:14:48.432939 4764 generic.go:334] "Generic (PLEG): container finished" podID="5fdc2149-703a-4038-9b06-75f9c5ef5f21" containerID="60160ebe62b3473a9a59940969d4c0ca7d19afad430ba5cb9613978c206d5402" exitCode=0 Dec 04 01:14:48 crc kubenswrapper[4764]: I1204 01:14:48.433096 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6cpft" event={"ID":"5fdc2149-703a-4038-9b06-75f9c5ef5f21","Type":"ContainerDied","Data":"60160ebe62b3473a9a59940969d4c0ca7d19afad430ba5cb9613978c206d5402"} Dec 04 01:14:48 crc kubenswrapper[4764]: I1204 01:14:48.691535 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:48 crc kubenswrapper[4764]: I1204 01:14:48.846235 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 01:14:48 crc kubenswrapper[4764]: I1204 01:14:48.958249 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:14:48 crc kubenswrapper[4764]: I1204 01:14:48.958309 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.446479 4764 generic.go:334] "Generic (PLEG): container finished" podID="adf4de2e-ddef-407f-bc65-e40c822a8a93" containerID="6efddbb211d9d38a885d9ae01978c9959a810d165236491985b62a83dffd366d" exitCode=0 Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.446619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zldpr" event={"ID":"adf4de2e-ddef-407f-bc65-e40c822a8a93","Type":"ContainerDied","Data":"6efddbb211d9d38a885d9ae01978c9959a810d165236491985b62a83dffd366d"} Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.878408 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.997201 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-config-data\") pod \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.997302 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmcsj\" (UniqueName: \"kubernetes.io/projected/5fdc2149-703a-4038-9b06-75f9c5ef5f21-kube-api-access-jmcsj\") pod \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.997466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-scripts\") pod \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " Dec 04 01:14:49 crc kubenswrapper[4764]: I1204 01:14:49.997515 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-combined-ca-bundle\") pod \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\" (UID: \"5fdc2149-703a-4038-9b06-75f9c5ef5f21\") " Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.003427 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-scripts" (OuterVolumeSpecName: "scripts") pod "5fdc2149-703a-4038-9b06-75f9c5ef5f21" (UID: "5fdc2149-703a-4038-9b06-75f9c5ef5f21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.004221 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fdc2149-703a-4038-9b06-75f9c5ef5f21-kube-api-access-jmcsj" (OuterVolumeSpecName: "kube-api-access-jmcsj") pod "5fdc2149-703a-4038-9b06-75f9c5ef5f21" (UID: "5fdc2149-703a-4038-9b06-75f9c5ef5f21"). InnerVolumeSpecName "kube-api-access-jmcsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.027012 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fdc2149-703a-4038-9b06-75f9c5ef5f21" (UID: "5fdc2149-703a-4038-9b06-75f9c5ef5f21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.027342 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-config-data" (OuterVolumeSpecName: "config-data") pod "5fdc2149-703a-4038-9b06-75f9c5ef5f21" (UID: "5fdc2149-703a-4038-9b06-75f9c5ef5f21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.100212 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.100267 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmcsj\" (UniqueName: \"kubernetes.io/projected/5fdc2149-703a-4038-9b06-75f9c5ef5f21-kube-api-access-jmcsj\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.100280 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.100289 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fdc2149-703a-4038-9b06-75f9c5ef5f21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.457606 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-6cpft" event={"ID":"5fdc2149-703a-4038-9b06-75f9c5ef5f21","Type":"ContainerDied","Data":"d1007d7ee19933b47af031eaf9ef877ca8d82d566ef8547cde9631d23cc6d2c5"} Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.459805 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1007d7ee19933b47af031eaf9ef877ca8d82d566ef8547cde9631d23cc6d2c5" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.457639 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-6cpft" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.543547 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:14:50 crc kubenswrapper[4764]: E1204 01:14:50.544020 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fdc2149-703a-4038-9b06-75f9c5ef5f21" containerName="nova-cell1-conductor-db-sync" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.544041 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fdc2149-703a-4038-9b06-75f9c5ef5f21" containerName="nova-cell1-conductor-db-sync" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.544263 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fdc2149-703a-4038-9b06-75f9c5ef5f21" containerName="nova-cell1-conductor-db-sync" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.545350 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.547886 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.562067 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.611041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.611453 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.611531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6267\" (UniqueName: \"kubernetes.io/projected/15ee17b2-31d9-499e-aea4-713c272534f8-kube-api-access-g6267\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.713416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.713459 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6267\" (UniqueName: \"kubernetes.io/projected/15ee17b2-31d9-499e-aea4-713c272534f8-kube-api-access-g6267\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.713524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.718645 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.719055 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.744373 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6267\" (UniqueName: \"kubernetes.io/projected/15ee17b2-31d9-499e-aea4-713c272534f8-kube-api-access-g6267\") pod \"nova-cell1-conductor-0\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.877929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:50 crc kubenswrapper[4764]: I1204 01:14:50.997966 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.126224 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-scripts\") pod \"adf4de2e-ddef-407f-bc65-e40c822a8a93\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.126415 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-combined-ca-bundle\") pod \"adf4de2e-ddef-407f-bc65-e40c822a8a93\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.126479 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-config-data\") pod \"adf4de2e-ddef-407f-bc65-e40c822a8a93\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.126543 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8wjt\" (UniqueName: \"kubernetes.io/projected/adf4de2e-ddef-407f-bc65-e40c822a8a93-kube-api-access-g8wjt\") pod \"adf4de2e-ddef-407f-bc65-e40c822a8a93\" (UID: \"adf4de2e-ddef-407f-bc65-e40c822a8a93\") " Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.132960 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf4de2e-ddef-407f-bc65-e40c822a8a93-kube-api-access-g8wjt" (OuterVolumeSpecName: "kube-api-access-g8wjt") pod "adf4de2e-ddef-407f-bc65-e40c822a8a93" (UID: "adf4de2e-ddef-407f-bc65-e40c822a8a93"). InnerVolumeSpecName "kube-api-access-g8wjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.135433 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-scripts" (OuterVolumeSpecName: "scripts") pod "adf4de2e-ddef-407f-bc65-e40c822a8a93" (UID: "adf4de2e-ddef-407f-bc65-e40c822a8a93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.153661 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-config-data" (OuterVolumeSpecName: "config-data") pod "adf4de2e-ddef-407f-bc65-e40c822a8a93" (UID: "adf4de2e-ddef-407f-bc65-e40c822a8a93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.156769 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adf4de2e-ddef-407f-bc65-e40c822a8a93" (UID: "adf4de2e-ddef-407f-bc65-e40c822a8a93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.229273 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.229321 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.229340 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8wjt\" (UniqueName: \"kubernetes.io/projected/adf4de2e-ddef-407f-bc65-e40c822a8a93-kube-api-access-g8wjt\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.229359 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adf4de2e-ddef-407f-bc65-e40c822a8a93-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.329253 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:14:51 crc kubenswrapper[4764]: W1204 01:14:51.341213 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15ee17b2_31d9_499e_aea4_713c272534f8.slice/crio-052d738f70a481b8918cc50b263823970dad605d73783e92e656fcf3971b53f0 WatchSource:0}: Error finding container 052d738f70a481b8918cc50b263823970dad605d73783e92e656fcf3971b53f0: Status 404 returned error can't find the container with id 052d738f70a481b8918cc50b263823970dad605d73783e92e656fcf3971b53f0 Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.469153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"15ee17b2-31d9-499e-aea4-713c272534f8","Type":"ContainerStarted","Data":"052d738f70a481b8918cc50b263823970dad605d73783e92e656fcf3971b53f0"} Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.471228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-zldpr" event={"ID":"adf4de2e-ddef-407f-bc65-e40c822a8a93","Type":"ContainerDied","Data":"feb2d6a8567379193927f0d9e9cb1fcea3075d8bb67a53a3b501cb621fead022"} Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.471268 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feb2d6a8567379193927f0d9e9cb1fcea3075d8bb67a53a3b501cb621fead022" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.471298 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-zldpr" Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.644770 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.645001 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-log" containerID="cri-o://9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98" gracePeriod=30 Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.645089 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-api" containerID="cri-o://588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf" gracePeriod=30 Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.674196 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.674431 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="4852d738-7bda-453f-b872-373712269eae" containerName="nova-scheduler-scheduler" containerID="cri-o://ed8c99b2c406adc69b78bfa7bebd1f680e93995a6c2adaf0ee3521de287762b1" gracePeriod=30 Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.688880 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.689170 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-log" containerID="cri-o://f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0" gracePeriod=30 Dec 04 01:14:51 crc kubenswrapper[4764]: I1204 01:14:51.689682 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-metadata" containerID="cri-o://f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194" gracePeriod=30 Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.279106 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.285115 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349414 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-combined-ca-bundle\") pod \"a55f8d14-d6c2-4de9-8a3b-821308938861\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349500 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-combined-ca-bundle\") pod \"ff1be383-0850-4d50-99ff-11e399ef6799\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349540 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h2vg\" (UniqueName: \"kubernetes.io/projected/a55f8d14-d6c2-4de9-8a3b-821308938861-kube-api-access-9h2vg\") pod \"a55f8d14-d6c2-4de9-8a3b-821308938861\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349620 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55f8d14-d6c2-4de9-8a3b-821308938861-logs\") pod \"a55f8d14-d6c2-4de9-8a3b-821308938861\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349677 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crjkz\" (UniqueName: \"kubernetes.io/projected/ff1be383-0850-4d50-99ff-11e399ef6799-kube-api-access-crjkz\") pod \"ff1be383-0850-4d50-99ff-11e399ef6799\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349735 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-config-data\") pod \"a55f8d14-d6c2-4de9-8a3b-821308938861\" (UID: \"a55f8d14-d6c2-4de9-8a3b-821308938861\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349756 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-config-data\") pod \"ff1be383-0850-4d50-99ff-11e399ef6799\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.349976 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff1be383-0850-4d50-99ff-11e399ef6799-logs\") pod \"ff1be383-0850-4d50-99ff-11e399ef6799\" (UID: \"ff1be383-0850-4d50-99ff-11e399ef6799\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.350650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1be383-0850-4d50-99ff-11e399ef6799-logs" (OuterVolumeSpecName: "logs") pod "ff1be383-0850-4d50-99ff-11e399ef6799" (UID: "ff1be383-0850-4d50-99ff-11e399ef6799"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.354853 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a55f8d14-d6c2-4de9-8a3b-821308938861-kube-api-access-9h2vg" (OuterVolumeSpecName: "kube-api-access-9h2vg") pod "a55f8d14-d6c2-4de9-8a3b-821308938861" (UID: "a55f8d14-d6c2-4de9-8a3b-821308938861"). InnerVolumeSpecName "kube-api-access-9h2vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.355110 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a55f8d14-d6c2-4de9-8a3b-821308938861-logs" (OuterVolumeSpecName: "logs") pod "a55f8d14-d6c2-4de9-8a3b-821308938861" (UID: "a55f8d14-d6c2-4de9-8a3b-821308938861"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.360491 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1be383-0850-4d50-99ff-11e399ef6799-kube-api-access-crjkz" (OuterVolumeSpecName: "kube-api-access-crjkz") pod "ff1be383-0850-4d50-99ff-11e399ef6799" (UID: "ff1be383-0850-4d50-99ff-11e399ef6799"). InnerVolumeSpecName "kube-api-access-crjkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.377531 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a55f8d14-d6c2-4de9-8a3b-821308938861" (UID: "a55f8d14-d6c2-4de9-8a3b-821308938861"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.382909 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-config-data" (OuterVolumeSpecName: "config-data") pod "ff1be383-0850-4d50-99ff-11e399ef6799" (UID: "ff1be383-0850-4d50-99ff-11e399ef6799"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.403070 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-config-data" (OuterVolumeSpecName: "config-data") pod "a55f8d14-d6c2-4de9-8a3b-821308938861" (UID: "a55f8d14-d6c2-4de9-8a3b-821308938861"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.403356 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff1be383-0850-4d50-99ff-11e399ef6799" (UID: "ff1be383-0850-4d50-99ff-11e399ef6799"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454135 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff1be383-0850-4d50-99ff-11e399ef6799-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454193 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454214 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454231 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h2vg\" (UniqueName: \"kubernetes.io/projected/a55f8d14-d6c2-4de9-8a3b-821308938861-kube-api-access-9h2vg\") on node \"crc\" DevicePath 
\"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454248 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a55f8d14-d6c2-4de9-8a3b-821308938861-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454264 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crjkz\" (UniqueName: \"kubernetes.io/projected/ff1be383-0850-4d50-99ff-11e399ef6799-kube-api-access-crjkz\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454280 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a55f8d14-d6c2-4de9-8a3b-821308938861-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.454296 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff1be383-0850-4d50-99ff-11e399ef6799-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.483676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"15ee17b2-31d9-499e-aea4-713c272534f8","Type":"ContainerStarted","Data":"fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.485794 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488583 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff1be383-0850-4d50-99ff-11e399ef6799" containerID="588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf" exitCode=0 Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488617 4764 generic.go:334] "Generic (PLEG): container finished" podID="ff1be383-0850-4d50-99ff-11e399ef6799" 
containerID="9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98" exitCode=143 Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488665 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff1be383-0850-4d50-99ff-11e399ef6799","Type":"ContainerDied","Data":"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff1be383-0850-4d50-99ff-11e399ef6799","Type":"ContainerDied","Data":"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488701 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff1be383-0850-4d50-99ff-11e399ef6799","Type":"ContainerDied","Data":"538235bfa70a736b2e9b4cb1557483592c3ca2ccb606ce1e2277195c219ff304"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488753 4764 scope.go:117] "RemoveContainer" containerID="588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.488881 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.500545 4764 generic.go:334] "Generic (PLEG): container finished" podID="4852d738-7bda-453f-b872-373712269eae" containerID="ed8c99b2c406adc69b78bfa7bebd1f680e93995a6c2adaf0ee3521de287762b1" exitCode=0 Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.500640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4852d738-7bda-453f-b872-373712269eae","Type":"ContainerDied","Data":"ed8c99b2c406adc69b78bfa7bebd1f680e93995a6c2adaf0ee3521de287762b1"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.504031 4764 generic.go:334] "Generic (PLEG): container finished" podID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerID="f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194" exitCode=0 Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.504060 4764 generic.go:334] "Generic (PLEG): container finished" podID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerID="f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0" exitCode=143 Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.504100 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a55f8d14-d6c2-4de9-8a3b-821308938861","Type":"ContainerDied","Data":"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.504126 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a55f8d14-d6c2-4de9-8a3b-821308938861","Type":"ContainerDied","Data":"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.504139 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"a55f8d14-d6c2-4de9-8a3b-821308938861","Type":"ContainerDied","Data":"d3c5d20a96389a39b721bbccfa21b03003a31c45867287dae0b551d6331250c5"} Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.504218 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.525419 4764 scope.go:117] "RemoveContainer" containerID="9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.548163 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.548135564 podStartE2EDuration="2.548135564s" podCreationTimestamp="2025-12-04 01:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:52.500931612 +0000 UTC m=+5628.262256023" watchObservedRunningTime="2025-12-04 01:14:52.548135564 +0000 UTC m=+5628.309459975" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.555417 4764 scope.go:117] "RemoveContainer" containerID="588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.555934 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf\": container with ID starting with 588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf not found: ID does not exist" containerID="588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.555974 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf"} err="failed to get container status 
\"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf\": rpc error: code = NotFound desc = could not find container \"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf\": container with ID starting with 588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.556000 4764 scope.go:117] "RemoveContainer" containerID="9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.556311 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98\": container with ID starting with 9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98 not found: ID does not exist" containerID="9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.556348 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98"} err="failed to get container status \"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98\": rpc error: code = NotFound desc = could not find container \"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98\": container with ID starting with 9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98 not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.556372 4764 scope.go:117] "RemoveContainer" containerID="588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.557248 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf"} err="failed to get 
container status \"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf\": rpc error: code = NotFound desc = could not find container \"588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf\": container with ID starting with 588b5867ac6aa8bf3ac5d2cdb375b90cd449d16ab9312ab7c6ac3aa9608eb9bf not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.557270 4764 scope.go:117] "RemoveContainer" containerID="9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.557756 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.558048 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98"} err="failed to get container status \"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98\": rpc error: code = NotFound desc = could not find container \"9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98\": container with ID starting with 9876906d2898dd833881f443ab610427031b2e10fa4259cc231f66e87aa58d98 not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.558068 4764 scope.go:117] "RemoveContainer" containerID="f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.565178 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.602122 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.602175 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.617606 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.618046 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-api" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618071 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-api" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.618088 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf4de2e-ddef-407f-bc65-e40c822a8a93" containerName="nova-manage" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618095 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf4de2e-ddef-407f-bc65-e40c822a8a93" containerName="nova-manage" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.618129 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-log" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618135 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-log" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.618146 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4852d738-7bda-453f-b872-373712269eae" containerName="nova-scheduler-scheduler" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618153 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4852d738-7bda-453f-b872-373712269eae" containerName="nova-scheduler-scheduler" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.618163 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-log" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618170 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-log" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.618183 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-metadata" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618188 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-metadata" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618348 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-log" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618360 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-metadata" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618372 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4852d738-7bda-453f-b872-373712269eae" containerName="nova-scheduler-scheduler" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618385 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" containerName="nova-api-api" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618395 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf4de2e-ddef-407f-bc65-e40c822a8a93" containerName="nova-manage" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.618405 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" containerName="nova-metadata-log" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.619306 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.619395 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.620001 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.622965 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.628299 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.630343 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.633106 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.635054 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.663909 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6vtg\" (UniqueName: \"kubernetes.io/projected/4852d738-7bda-453f-b872-373712269eae-kube-api-access-m6vtg\") pod \"4852d738-7bda-453f-b872-373712269eae\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.664016 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-combined-ca-bundle\") pod \"4852d738-7bda-453f-b872-373712269eae\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.664109 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-config-data\") pod 
\"4852d738-7bda-453f-b872-373712269eae\" (UID: \"4852d738-7bda-453f-b872-373712269eae\") " Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.664727 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mphrc\" (UniqueName: \"kubernetes.io/projected/c655b87d-83a0-4c58-adcf-c65d5364864b-kube-api-access-mphrc\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.664766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c655b87d-83a0-4c58-adcf-c65d5364864b-logs\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.664808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.664928 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-config-data\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.679548 4764 scope.go:117] "RemoveContainer" containerID="f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.695809 4764 scope.go:117] "RemoveContainer" containerID="f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.696251 4764 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194\": container with ID starting with f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194 not found: ID does not exist" containerID="f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.696283 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194"} err="failed to get container status \"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194\": rpc error: code = NotFound desc = could not find container \"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194\": container with ID starting with f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194 not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.696309 4764 scope.go:117] "RemoveContainer" containerID="f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0" Dec 04 01:14:52 crc kubenswrapper[4764]: E1204 01:14:52.696571 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0\": container with ID starting with f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0 not found: ID does not exist" containerID="f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.696596 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0"} err="failed to get container status \"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0\": rpc error: code = NotFound 
desc = could not find container \"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0\": container with ID starting with f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0 not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.696617 4764 scope.go:117] "RemoveContainer" containerID="f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.696887 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194"} err="failed to get container status \"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194\": rpc error: code = NotFound desc = could not find container \"f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194\": container with ID starting with f85bbce89e553f19700f13c6af3591e181f0f55757db6e2957f20c494e7cf194 not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.696908 4764 scope.go:117] "RemoveContainer" containerID="f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.700199 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0"} err="failed to get container status \"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0\": rpc error: code = NotFound desc = could not find container \"f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0\": container with ID starting with f4e106dc2f2320ceba6450463450840a70f3df7c990f8a3bafa44d95ad8a70d0 not found: ID does not exist" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.728460 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4852d738-7bda-453f-b872-373712269eae-kube-api-access-m6vtg" 
(OuterVolumeSpecName: "kube-api-access-m6vtg") pod "4852d738-7bda-453f-b872-373712269eae" (UID: "4852d738-7bda-453f-b872-373712269eae"). InnerVolumeSpecName "kube-api-access-m6vtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.732624 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-config-data" (OuterVolumeSpecName: "config-data") pod "4852d738-7bda-453f-b872-373712269eae" (UID: "4852d738-7bda-453f-b872-373712269eae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.733147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4852d738-7bda-453f-b872-373712269eae" (UID: "4852d738-7bda-453f-b872-373712269eae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.766808 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.766905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d2cba0-a1b9-4974-b37f-0c0c979543cf-logs\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.766977 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-config-data\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767008 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mphrc\" (UniqueName: \"kubernetes.io/projected/c655b87d-83a0-4c58-adcf-c65d5364864b-kube-api-access-mphrc\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c655b87d-83a0-4c58-adcf-c65d5364864b-logs\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767066 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767099 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4cq\" (UniqueName: \"kubernetes.io/projected/09d2cba0-a1b9-4974-b37f-0c0c979543cf-kube-api-access-xx4cq\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767129 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-config-data\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767182 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767196 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6vtg\" (UniqueName: \"kubernetes.io/projected/4852d738-7bda-453f-b872-373712269eae-kube-api-access-m6vtg\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767205 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4852d738-7bda-453f-b872-373712269eae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.767471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c655b87d-83a0-4c58-adcf-c65d5364864b-logs\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.771334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-config-data\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.772127 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.789562 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mphrc\" (UniqueName: \"kubernetes.io/projected/c655b87d-83a0-4c58-adcf-c65d5364864b-kube-api-access-mphrc\") pod \"nova-api-0\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.868738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.868810 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d2cba0-a1b9-4974-b37f-0c0c979543cf-logs\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.868856 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-config-data\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.869224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d2cba0-a1b9-4974-b37f-0c0c979543cf-logs\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.869288 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4cq\" (UniqueName: \"kubernetes.io/projected/09d2cba0-a1b9-4974-b37f-0c0c979543cf-kube-api-access-xx4cq\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.872340 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.872901 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-config-data\") pod \"nova-metadata-0\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.893698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4cq\" (UniqueName: \"kubernetes.io/projected/09d2cba0-a1b9-4974-b37f-0c0c979543cf-kube-api-access-xx4cq\") pod \"nova-metadata-0\" (UID: 
\"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " pod="openstack/nova-metadata-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.935733 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:14:52 crc kubenswrapper[4764]: I1204 01:14:52.948840 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.444360 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.514156 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c655b87d-83a0-4c58-adcf-c65d5364864b","Type":"ContainerStarted","Data":"433fa3ec5ce4d90c7ef26e828b2b5e4722a1f63fc02c55565448005b634d0d23"} Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.517970 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.519905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4852d738-7bda-453f-b872-373712269eae","Type":"ContainerDied","Data":"b4b28b6cf159a2f718c4edcd9723e92ac9a8d684b9af687c3a05b4f07f38d8f1"} Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.519957 4764 scope.go:117] "RemoveContainer" containerID="ed8c99b2c406adc69b78bfa7bebd1f680e93995a6c2adaf0ee3521de287762b1" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.584611 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.597700 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.607764 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:53 crc 
kubenswrapper[4764]: I1204 01:14:53.609148 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.612055 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.620049 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.685386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.685461 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-config-data\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.685634 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnh88\" (UniqueName: \"kubernetes.io/projected/30c0aebf-0e0d-4707-82ec-ab031064f14c-kube-api-access-gnh88\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.696902 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.711224 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 
01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.769414 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.786792 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.786829 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-config-data\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.786972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnh88\" (UniqueName: \"kubernetes.io/projected/30c0aebf-0e0d-4707-82ec-ab031064f14c-kube-api-access-gnh88\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.792471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.792531 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-config-data\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: 
I1204 01:14:53.803434 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnh88\" (UniqueName: \"kubernetes.io/projected/30c0aebf-0e0d-4707-82ec-ab031064f14c-kube-api-access-gnh88\") pod \"nova-scheduler-0\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " pod="openstack/nova-scheduler-0" Dec 04 01:14:53 crc kubenswrapper[4764]: I1204 01:14:53.926981 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.037130 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.113109 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c95ff496c-ntszq"] Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.113331 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerName="dnsmasq-dns" containerID="cri-o://86a404821c1b0b4c38d049838203bad535961ecbb664ad5078e75aeb0e192169" gracePeriod=10 Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.503972 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.532314 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30c0aebf-0e0d-4707-82ec-ab031064f14c","Type":"ContainerStarted","Data":"cd8140a9fefafc89201798967a034fde500f69a7cf4809b9d17df3f9162b730c"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.545661 4764 generic.go:334] "Generic (PLEG): container finished" podID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerID="86a404821c1b0b4c38d049838203bad535961ecbb664ad5078e75aeb0e192169" exitCode=0 Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.588649 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4852d738-7bda-453f-b872-373712269eae" path="/var/lib/kubelet/pods/4852d738-7bda-453f-b872-373712269eae/volumes" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.589353 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a55f8d14-d6c2-4de9-8a3b-821308938861" path="/var/lib/kubelet/pods/a55f8d14-d6c2-4de9-8a3b-821308938861/volumes" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.590049 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1be383-0850-4d50-99ff-11e399ef6799" path="/var/lib/kubelet/pods/ff1be383-0850-4d50-99ff-11e399ef6799/volumes" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.591491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" event={"ID":"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4","Type":"ContainerDied","Data":"86a404821c1b0b4c38d049838203bad535961ecbb664ad5078e75aeb0e192169"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.591532 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c655b87d-83a0-4c58-adcf-c65d5364864b","Type":"ContainerStarted","Data":"aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.591561 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c655b87d-83a0-4c58-adcf-c65d5364864b","Type":"ContainerStarted","Data":"1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.591571 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09d2cba0-a1b9-4974-b37f-0c0c979543cf","Type":"ContainerStarted","Data":"833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.591584 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"09d2cba0-a1b9-4974-b37f-0c0c979543cf","Type":"ContainerStarted","Data":"1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.591594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09d2cba0-a1b9-4974-b37f-0c0c979543cf","Type":"ContainerStarted","Data":"472ba4ff394845060394a53a591417da776a28887417d07a69271e816205f516"} Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.616065 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.620259 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.673935 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.673917163 podStartE2EDuration="2.673917163s" podCreationTimestamp="2025-12-04 01:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:54.672072378 +0000 UTC m=+5630.433396789" watchObservedRunningTime="2025-12-04 01:14:54.673917163 +0000 UTC m=+5630.435241574" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.720272 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.720253804 podStartE2EDuration="2.720253804s" podCreationTimestamp="2025-12-04 01:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:54.713880947 +0000 UTC m=+5630.475205378" watchObservedRunningTime="2025-12-04 01:14:54.720253804 +0000 UTC m=+5630.481578215" Dec 04 01:14:54 crc 
kubenswrapper[4764]: I1204 01:14:54.745478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-sb\") pod \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.745536 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-config\") pod \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.745605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-dns-svc\") pod \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.745696 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-nb\") pod \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.745846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6ll\" (UniqueName: \"kubernetes.io/projected/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-kube-api-access-zz6ll\") pod \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\" (UID: \"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4\") " Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.768470 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-kube-api-access-zz6ll" (OuterVolumeSpecName: 
"kube-api-access-zz6ll") pod "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" (UID: "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4"). InnerVolumeSpecName "kube-api-access-zz6ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.798186 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" (UID: "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.807226 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-config" (OuterVolumeSpecName: "config") pod "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" (UID: "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.818677 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" (UID: "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.825267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" (UID: "0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.851245 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.851294 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.851306 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.851330 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:54 crc kubenswrapper[4764]: I1204 01:14:54.851359 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6ll\" (UniqueName: \"kubernetes.io/projected/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4-kube-api-access-zz6ll\") on node \"crc\" DevicePath \"\"" Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.580751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30c0aebf-0e0d-4707-82ec-ab031064f14c","Type":"ContainerStarted","Data":"d797ebbed765f1c892aa484ee943bd72ec7ccfc2e72572f5cc782653f50ffdd2"} Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.583573 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" event={"ID":"0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4","Type":"ContainerDied","Data":"e6c2d6ef6853ea0d3c809a9231f09338967100e6faa18a7d37b204b37124ed56"} Dec 
04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.583631 4764 scope.go:117] "RemoveContainer" containerID="86a404821c1b0b4c38d049838203bad535961ecbb664ad5078e75aeb0e192169" Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.583840 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c95ff496c-ntszq" Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.605529 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6055076059999998 podStartE2EDuration="2.605507606s" podCreationTimestamp="2025-12-04 01:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:14:55.602339018 +0000 UTC m=+5631.363663469" watchObservedRunningTime="2025-12-04 01:14:55.605507606 +0000 UTC m=+5631.366832017" Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.621442 4764 scope.go:117] "RemoveContainer" containerID="d18228b4af3f21059dc9de2c989e2945ec2eed3c6c01a8e1241f107e3a22092f" Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.647204 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c95ff496c-ntszq"] Dec 04 01:14:55 crc kubenswrapper[4764]: I1204 01:14:55.658112 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c95ff496c-ntszq"] Dec 04 01:14:56 crc kubenswrapper[4764]: I1204 01:14:56.560764 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" path="/var/lib/kubelet/pods/0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4/volumes" Dec 04 01:14:57 crc kubenswrapper[4764]: I1204 01:14:57.949853 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:14:57 crc kubenswrapper[4764]: I1204 01:14:57.952933 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Dec 04 01:14:58 crc kubenswrapper[4764]: I1204 01:14:58.927622 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.130897 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k"] Dec 04 01:15:00 crc kubenswrapper[4764]: E1204 01:15:00.131591 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerName="dnsmasq-dns" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.131605 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerName="dnsmasq-dns" Dec 04 01:15:00 crc kubenswrapper[4764]: E1204 01:15:00.131618 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerName="init" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.131624 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerName="init" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.131830 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a2a5e07-ea25-4bf6-8e58-3ed40d82ced4" containerName="dnsmasq-dns" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.132473 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.134256 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.135118 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.178217 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k"] Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.276738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtc66\" (UniqueName: \"kubernetes.io/projected/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-kube-api-access-rtc66\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.276834 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-config-volume\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.276854 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-secret-volume\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.378841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-config-volume\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.379139 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-secret-volume\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.379327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtc66\" (UniqueName: \"kubernetes.io/projected/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-kube-api-access-rtc66\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.379682 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-config-volume\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.399764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-secret-volume\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.411275 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtc66\" (UniqueName: \"kubernetes.io/projected/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-kube-api-access-rtc66\") pod \"collect-profiles-29413515-vws9k\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.461844 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.893134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k"] Dec 04 01:15:00 crc kubenswrapper[4764]: I1204 01:15:00.941861 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.430168 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pdqmj"] Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.431652 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.434158 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.434342 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.439450 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pdqmj"] Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.602406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-config-data\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.602479 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.602652 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-scripts\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.602978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9q4l\" (UniqueName: 
\"kubernetes.io/projected/f4989134-2aa5-4637-8256-6e5557b657e6-kube-api-access-g9q4l\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.651550 4764 generic.go:334] "Generic (PLEG): container finished" podID="d933b64c-ee5e-4ad7-b62d-36db00ebef8b" containerID="a2b92b56099585bb86e902a2ad8678367bfc0b765db991f43d8d751f33ec46bb" exitCode=0 Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.651603 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" event={"ID":"d933b64c-ee5e-4ad7-b62d-36db00ebef8b","Type":"ContainerDied","Data":"a2b92b56099585bb86e902a2ad8678367bfc0b765db991f43d8d751f33ec46bb"} Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.651637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" event={"ID":"d933b64c-ee5e-4ad7-b62d-36db00ebef8b","Type":"ContainerStarted","Data":"d6d1a6379b23a90be1b591822375c3de4a78a6cd3d3316489e9740340135793b"} Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.704553 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9q4l\" (UniqueName: \"kubernetes.io/projected/f4989134-2aa5-4637-8256-6e5557b657e6-kube-api-access-g9q4l\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.704623 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-config-data\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 
01:15:01.704682 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.704729 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-scripts\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.709799 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-scripts\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.709975 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-config-data\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.710250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.719843 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g9q4l\" (UniqueName: \"kubernetes.io/projected/f4989134-2aa5-4637-8256-6e5557b657e6-kube-api-access-g9q4l\") pod \"nova-cell1-cell-mapping-pdqmj\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:01 crc kubenswrapper[4764]: I1204 01:15:01.768190 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.245408 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pdqmj"] Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.668865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pdqmj" event={"ID":"f4989134-2aa5-4637-8256-6e5557b657e6","Type":"ContainerStarted","Data":"f58cc76dfb1041d9c807ee6dd726ef98574cde48e0cc21f8e4e7658fab04785d"} Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.668927 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pdqmj" event={"ID":"f4989134-2aa5-4637-8256-6e5557b657e6","Type":"ContainerStarted","Data":"0d7c3c9142debb2946902899056713619a89651f9ae78be60f9bc851bbe411d3"} Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.692226 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pdqmj" podStartSLOduration=1.692208954 podStartE2EDuration="1.692208954s" podCreationTimestamp="2025-12-04 01:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:02.683350336 +0000 UTC m=+5638.444674747" watchObservedRunningTime="2025-12-04 01:15:02.692208954 +0000 UTC m=+5638.453533365" Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.940073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 01:15:02 crc 
kubenswrapper[4764]: I1204 01:15:02.940112 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.949608 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 01:15:02 crc kubenswrapper[4764]: I1204 01:15:02.949662 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.055536 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.130082 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-secret-volume\") pod \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.130149 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtc66\" (UniqueName: \"kubernetes.io/projected/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-kube-api-access-rtc66\") pod \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.130253 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-config-volume\") pod \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\" (UID: \"d933b64c-ee5e-4ad7-b62d-36db00ebef8b\") " Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.130959 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "d933b64c-ee5e-4ad7-b62d-36db00ebef8b" (UID: "d933b64c-ee5e-4ad7-b62d-36db00ebef8b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.144300 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d933b64c-ee5e-4ad7-b62d-36db00ebef8b" (UID: "d933b64c-ee5e-4ad7-b62d-36db00ebef8b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.144644 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-kube-api-access-rtc66" (OuterVolumeSpecName: "kube-api-access-rtc66") pod "d933b64c-ee5e-4ad7-b62d-36db00ebef8b" (UID: "d933b64c-ee5e-4ad7-b62d-36db00ebef8b"). InnerVolumeSpecName "kube-api-access-rtc66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.232730 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.232760 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtc66\" (UniqueName: \"kubernetes.io/projected/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-kube-api-access-rtc66\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.232770 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d933b64c-ee5e-4ad7-b62d-36db00ebef8b-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.677935 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" event={"ID":"d933b64c-ee5e-4ad7-b62d-36db00ebef8b","Type":"ContainerDied","Data":"d6d1a6379b23a90be1b591822375c3de4a78a6cd3d3316489e9740340135793b"} Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.677981 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6d1a6379b23a90be1b591822375c3de4a78a6cd3d3316489e9740340135793b" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.678857 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.927603 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 01:15:03 crc kubenswrapper[4764]: I1204 01:15:03.982421 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.108266 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.108893 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.109126 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.109576 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.188448 4764 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"] Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.197414 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413470-kcvf8"] Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.565607 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2a5047-a6e6-4243-86dc-4ce470ab83af" path="/var/lib/kubelet/pods/1b2a5047-a6e6-4243-86dc-4ce470ab83af/volumes" Dec 04 01:15:04 crc kubenswrapper[4764]: I1204 01:15:04.728402 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 01:15:07 crc kubenswrapper[4764]: I1204 01:15:07.721530 4764 generic.go:334] "Generic (PLEG): container finished" podID="f4989134-2aa5-4637-8256-6e5557b657e6" containerID="f58cc76dfb1041d9c807ee6dd726ef98574cde48e0cc21f8e4e7658fab04785d" exitCode=0 Dec 04 01:15:07 crc kubenswrapper[4764]: I1204 01:15:07.721645 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pdqmj" event={"ID":"f4989134-2aa5-4637-8256-6e5557b657e6","Type":"ContainerDied","Data":"f58cc76dfb1041d9c807ee6dd726ef98574cde48e0cc21f8e4e7658fab04785d"} Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.085268 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.142233 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9q4l\" (UniqueName: \"kubernetes.io/projected/f4989134-2aa5-4637-8256-6e5557b657e6-kube-api-access-g9q4l\") pod \"f4989134-2aa5-4637-8256-6e5557b657e6\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.142400 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-config-data\") pod \"f4989134-2aa5-4637-8256-6e5557b657e6\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.142487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-scripts\") pod \"f4989134-2aa5-4637-8256-6e5557b657e6\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.142553 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-combined-ca-bundle\") pod \"f4989134-2aa5-4637-8256-6e5557b657e6\" (UID: \"f4989134-2aa5-4637-8256-6e5557b657e6\") " Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.148498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-scripts" (OuterVolumeSpecName: "scripts") pod "f4989134-2aa5-4637-8256-6e5557b657e6" (UID: "f4989134-2aa5-4637-8256-6e5557b657e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.149087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4989134-2aa5-4637-8256-6e5557b657e6-kube-api-access-g9q4l" (OuterVolumeSpecName: "kube-api-access-g9q4l") pod "f4989134-2aa5-4637-8256-6e5557b657e6" (UID: "f4989134-2aa5-4637-8256-6e5557b657e6"). InnerVolumeSpecName "kube-api-access-g9q4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.169111 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-config-data" (OuterVolumeSpecName: "config-data") pod "f4989134-2aa5-4637-8256-6e5557b657e6" (UID: "f4989134-2aa5-4637-8256-6e5557b657e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.175539 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4989134-2aa5-4637-8256-6e5557b657e6" (UID: "f4989134-2aa5-4637-8256-6e5557b657e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.245541 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9q4l\" (UniqueName: \"kubernetes.io/projected/f4989134-2aa5-4637-8256-6e5557b657e6-kube-api-access-g9q4l\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.245632 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.245680 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.245695 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4989134-2aa5-4637-8256-6e5557b657e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.751399 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pdqmj" event={"ID":"f4989134-2aa5-4637-8256-6e5557b657e6","Type":"ContainerDied","Data":"0d7c3c9142debb2946902899056713619a89651f9ae78be60f9bc851bbe411d3"} Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.751459 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d7c3c9142debb2946902899056713619a89651f9ae78be60f9bc851bbe411d3" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.751482 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pdqmj" Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.870629 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.870951 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-log" containerID="cri-o://1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324" gracePeriod=30 Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.871090 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-api" containerID="cri-o://aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6" gracePeriod=30 Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.919375 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.919961 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="30c0aebf-0e0d-4707-82ec-ab031064f14c" containerName="nova-scheduler-scheduler" containerID="cri-o://d797ebbed765f1c892aa484ee943bd72ec7ccfc2e72572f5cc782653f50ffdd2" gracePeriod=30 Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.942839 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.943062 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-log" containerID="cri-o://1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd" gracePeriod=30 Dec 04 01:15:09 crc kubenswrapper[4764]: I1204 01:15:09.943446 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-metadata" containerID="cri-o://833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4" gracePeriod=30 Dec 04 01:15:10 crc kubenswrapper[4764]: I1204 01:15:10.768111 4764 generic.go:334] "Generic (PLEG): container finished" podID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerID="1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd" exitCode=143 Dec 04 01:15:10 crc kubenswrapper[4764]: I1204 01:15:10.768197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09d2cba0-a1b9-4974-b37f-0c0c979543cf","Type":"ContainerDied","Data":"1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd"} Dec 04 01:15:10 crc kubenswrapper[4764]: I1204 01:15:10.770163 4764 generic.go:334] "Generic (PLEG): container finished" podID="30c0aebf-0e0d-4707-82ec-ab031064f14c" containerID="d797ebbed765f1c892aa484ee943bd72ec7ccfc2e72572f5cc782653f50ffdd2" exitCode=0 Dec 04 01:15:10 crc kubenswrapper[4764]: I1204 01:15:10.770197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30c0aebf-0e0d-4707-82ec-ab031064f14c","Type":"ContainerDied","Data":"d797ebbed765f1c892aa484ee943bd72ec7ccfc2e72572f5cc782653f50ffdd2"} Dec 04 01:15:10 crc kubenswrapper[4764]: I1204 01:15:10.771449 4764 generic.go:334] "Generic (PLEG): container finished" podID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerID="1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324" exitCode=143 Dec 04 01:15:10 crc kubenswrapper[4764]: I1204 01:15:10.771469 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c655b87d-83a0-4c58-adcf-c65d5364864b","Type":"ContainerDied","Data":"1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324"} Dec 04 01:15:11 crc kubenswrapper[4764]: 
I1204 01:15:11.150937 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.291086 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnh88\" (UniqueName: \"kubernetes.io/projected/30c0aebf-0e0d-4707-82ec-ab031064f14c-kube-api-access-gnh88\") pod \"30c0aebf-0e0d-4707-82ec-ab031064f14c\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.291481 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-combined-ca-bundle\") pod \"30c0aebf-0e0d-4707-82ec-ab031064f14c\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.291602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-config-data\") pod \"30c0aebf-0e0d-4707-82ec-ab031064f14c\" (UID: \"30c0aebf-0e0d-4707-82ec-ab031064f14c\") " Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.297717 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c0aebf-0e0d-4707-82ec-ab031064f14c-kube-api-access-gnh88" (OuterVolumeSpecName: "kube-api-access-gnh88") pod "30c0aebf-0e0d-4707-82ec-ab031064f14c" (UID: "30c0aebf-0e0d-4707-82ec-ab031064f14c"). InnerVolumeSpecName "kube-api-access-gnh88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.318931 4764 scope.go:117] "RemoveContainer" containerID="bd33bb6a49ac286e9942db465fb51ac9ad537c6fa520686436cd674819daed2e" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.320116 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-config-data" (OuterVolumeSpecName: "config-data") pod "30c0aebf-0e0d-4707-82ec-ab031064f14c" (UID: "30c0aebf-0e0d-4707-82ec-ab031064f14c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.334674 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30c0aebf-0e0d-4707-82ec-ab031064f14c" (UID: "30c0aebf-0e0d-4707-82ec-ab031064f14c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.394329 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.394372 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30c0aebf-0e0d-4707-82ec-ab031064f14c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.394387 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnh88\" (UniqueName: \"kubernetes.io/projected/30c0aebf-0e0d-4707-82ec-ab031064f14c-kube-api-access-gnh88\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.416517 4764 scope.go:117] "RemoveContainer" containerID="9b756fc09137183b5a66608c09aa8cb4764e3e6f4c356e4a53f7ab70fd9c2ad8" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.780566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"30c0aebf-0e0d-4707-82ec-ab031064f14c","Type":"ContainerDied","Data":"cd8140a9fefafc89201798967a034fde500f69a7cf4809b9d17df3f9162b730c"} Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.780625 4764 scope.go:117] "RemoveContainer" containerID="d797ebbed765f1c892aa484ee943bd72ec7ccfc2e72572f5cc782653f50ffdd2" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.780696 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.843682 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.847543 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.867815 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:15:11 crc kubenswrapper[4764]: E1204 01:15:11.868362 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4989134-2aa5-4637-8256-6e5557b657e6" containerName="nova-manage" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.868391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4989134-2aa5-4637-8256-6e5557b657e6" containerName="nova-manage" Dec 04 01:15:11 crc kubenswrapper[4764]: E1204 01:15:11.868424 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c0aebf-0e0d-4707-82ec-ab031064f14c" containerName="nova-scheduler-scheduler" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.868441 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c0aebf-0e0d-4707-82ec-ab031064f14c" containerName="nova-scheduler-scheduler" Dec 04 01:15:11 crc kubenswrapper[4764]: E1204 01:15:11.868479 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d933b64c-ee5e-4ad7-b62d-36db00ebef8b" containerName="collect-profiles" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.868494 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d933b64c-ee5e-4ad7-b62d-36db00ebef8b" containerName="collect-profiles" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.868851 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c0aebf-0e0d-4707-82ec-ab031064f14c" containerName="nova-scheduler-scheduler" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.868877 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4989134-2aa5-4637-8256-6e5557b657e6" containerName="nova-manage" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.868916 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d933b64c-ee5e-4ad7-b62d-36db00ebef8b" containerName="collect-profiles" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.870088 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.878139 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 01:15:11 crc kubenswrapper[4764]: I1204 01:15:11.900145 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.005091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqmhf\" (UniqueName: \"kubernetes.io/projected/8bdb5ccb-650f-4264-810f-0a3ad037f59e-kube-api-access-kqmhf\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.005245 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-config-data\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.005265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 
01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.106696 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-config-data\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.106752 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.106822 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqmhf\" (UniqueName: \"kubernetes.io/projected/8bdb5ccb-650f-4264-810f-0a3ad037f59e-kube-api-access-kqmhf\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.112235 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-config-data\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.112234 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.135618 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqmhf\" (UniqueName: 
\"kubernetes.io/projected/8bdb5ccb-650f-4264-810f-0a3ad037f59e-kube-api-access-kqmhf\") pod \"nova-scheduler-0\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.192822 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.555845 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c0aebf-0e0d-4707-82ec-ab031064f14c" path="/var/lib/kubelet/pods/30c0aebf-0e0d-4707-82ec-ab031064f14c/volumes" Dec 04 01:15:12 crc kubenswrapper[4764]: I1204 01:15:12.797632 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.484778 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.489443 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546042 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c655b87d-83a0-4c58-adcf-c65d5364864b-logs\") pod \"c655b87d-83a0-4c58-adcf-c65d5364864b\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546161 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d2cba0-a1b9-4974-b37f-0c0c979543cf-logs\") pod \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546278 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-combined-ca-bundle\") pod \"c655b87d-83a0-4c58-adcf-c65d5364864b\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546324 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-config-data\") pod \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546402 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mphrc\" (UniqueName: \"kubernetes.io/projected/c655b87d-83a0-4c58-adcf-c65d5364864b-kube-api-access-mphrc\") pod \"c655b87d-83a0-4c58-adcf-c65d5364864b\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546446 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx4cq\" (UniqueName: 
\"kubernetes.io/projected/09d2cba0-a1b9-4974-b37f-0c0c979543cf-kube-api-access-xx4cq\") pod \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546486 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-combined-ca-bundle\") pod \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\" (UID: \"09d2cba0-a1b9-4974-b37f-0c0c979543cf\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546618 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-config-data\") pod \"c655b87d-83a0-4c58-adcf-c65d5364864b\" (UID: \"c655b87d-83a0-4c58-adcf-c65d5364864b\") " Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546782 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c655b87d-83a0-4c58-adcf-c65d5364864b-logs" (OuterVolumeSpecName: "logs") pod "c655b87d-83a0-4c58-adcf-c65d5364864b" (UID: "c655b87d-83a0-4c58-adcf-c65d5364864b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.546826 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d2cba0-a1b9-4974-b37f-0c0c979543cf-logs" (OuterVolumeSpecName: "logs") pod "09d2cba0-a1b9-4974-b37f-0c0c979543cf" (UID: "09d2cba0-a1b9-4974-b37f-0c0c979543cf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.547314 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c655b87d-83a0-4c58-adcf-c65d5364864b-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.547348 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09d2cba0-a1b9-4974-b37f-0c0c979543cf-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.551575 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c655b87d-83a0-4c58-adcf-c65d5364864b-kube-api-access-mphrc" (OuterVolumeSpecName: "kube-api-access-mphrc") pod "c655b87d-83a0-4c58-adcf-c65d5364864b" (UID: "c655b87d-83a0-4c58-adcf-c65d5364864b"). InnerVolumeSpecName "kube-api-access-mphrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.554891 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d2cba0-a1b9-4974-b37f-0c0c979543cf-kube-api-access-xx4cq" (OuterVolumeSpecName: "kube-api-access-xx4cq") pod "09d2cba0-a1b9-4974-b37f-0c0c979543cf" (UID: "09d2cba0-a1b9-4974-b37f-0c0c979543cf"). InnerVolumeSpecName "kube-api-access-xx4cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.571555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-config-data" (OuterVolumeSpecName: "config-data") pod "09d2cba0-a1b9-4974-b37f-0c0c979543cf" (UID: "09d2cba0-a1b9-4974-b37f-0c0c979543cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.572340 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-config-data" (OuterVolumeSpecName: "config-data") pod "c655b87d-83a0-4c58-adcf-c65d5364864b" (UID: "c655b87d-83a0-4c58-adcf-c65d5364864b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.574775 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c655b87d-83a0-4c58-adcf-c65d5364864b" (UID: "c655b87d-83a0-4c58-adcf-c65d5364864b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.574877 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09d2cba0-a1b9-4974-b37f-0c0c979543cf" (UID: "09d2cba0-a1b9-4974-b37f-0c0c979543cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.649383 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.649433 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.649446 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mphrc\" (UniqueName: \"kubernetes.io/projected/c655b87d-83a0-4c58-adcf-c65d5364864b-kube-api-access-mphrc\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.649461 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx4cq\" (UniqueName: \"kubernetes.io/projected/09d2cba0-a1b9-4974-b37f-0c0c979543cf-kube-api-access-xx4cq\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.649473 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d2cba0-a1b9-4974-b37f-0c0c979543cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.649484 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c655b87d-83a0-4c58-adcf-c65d5364864b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.803522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bdb5ccb-650f-4264-810f-0a3ad037f59e","Type":"ContainerStarted","Data":"92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c"} Dec 04 01:15:13 
crc kubenswrapper[4764]: I1204 01:15:13.803568 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bdb5ccb-650f-4264-810f-0a3ad037f59e","Type":"ContainerStarted","Data":"4a05aa02c6f8df427720a5c8daf71fff7a5f5585f421f94ca737a1d367fac159"} Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.805595 4764 generic.go:334] "Generic (PLEG): container finished" podID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerID="aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6" exitCode=0 Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.805650 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c655b87d-83a0-4c58-adcf-c65d5364864b","Type":"ContainerDied","Data":"aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6"} Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.805668 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c655b87d-83a0-4c58-adcf-c65d5364864b","Type":"ContainerDied","Data":"433fa3ec5ce4d90c7ef26e828b2b5e4722a1f63fc02c55565448005b634d0d23"} Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.805674 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.805684 4764 scope.go:117] "RemoveContainer" containerID="aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.809260 4764 generic.go:334] "Generic (PLEG): container finished" podID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerID="833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4" exitCode=0 Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.809295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09d2cba0-a1b9-4974-b37f-0c0c979543cf","Type":"ContainerDied","Data":"833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4"} Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.809313 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09d2cba0-a1b9-4974-b37f-0c0c979543cf","Type":"ContainerDied","Data":"472ba4ff394845060394a53a591417da776a28887417d07a69271e816205f516"} Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.809363 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.838092 4764 scope.go:117] "RemoveContainer" containerID="1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.842552 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.842519384 podStartE2EDuration="2.842519384s" podCreationTimestamp="2025-12-04 01:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:13.842028882 +0000 UTC m=+5649.603353293" watchObservedRunningTime="2025-12-04 01:15:13.842519384 +0000 UTC m=+5649.603843785" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.885543 4764 scope.go:117] "RemoveContainer" containerID="aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.886051 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6\": container with ID starting with aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6 not found: ID does not exist" containerID="aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.886098 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6"} err="failed to get container status \"aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6\": rpc error: code = NotFound desc = could not find container \"aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6\": container with ID starting with 
aeb2130c527b24e9f027656b59a32c8a12e59145948a33759ff0cfb5949ec8f6 not found: ID does not exist" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.886131 4764 scope.go:117] "RemoveContainer" containerID="1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.886504 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324\": container with ID starting with 1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324 not found: ID does not exist" containerID="1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.886538 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324"} err="failed to get container status \"1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324\": rpc error: code = NotFound desc = could not find container \"1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324\": container with ID starting with 1d686473f79b11dfa440e562dbf3ad46d9f9d0c893ce0fadb46f2983dffb8324 not found: ID does not exist" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.886566 4764 scope.go:117] "RemoveContainer" containerID="833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.902924 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.915549 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.930533 4764 scope.go:117] "RemoveContainer" containerID="1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd" Dec 04 01:15:13 
crc kubenswrapper[4764]: I1204 01:15:13.933841 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.950601 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.956888 4764 scope.go:117] "RemoveContainer" containerID="833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.957475 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4\": container with ID starting with 833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4 not found: ID does not exist" containerID="833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.957511 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4"} err="failed to get container status \"833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4\": rpc error: code = NotFound desc = could not find container \"833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4\": container with ID starting with 833379c08c2ae9999407e2cff0c512121e4a2ea39696e02765f9a27cde63fde4 not found: ID does not exist" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.957541 4764 scope.go:117] "RemoveContainer" containerID="1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.957923 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd\": container with ID starting with 
1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd not found: ID does not exist" containerID="1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.957964 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd"} err="failed to get container status \"1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd\": rpc error: code = NotFound desc = could not find container \"1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd\": container with ID starting with 1fe84ab148b3d48c6b6032e8c64f77481de144dbd52b2ba578cc2542eaf272bd not found: ID does not exist" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.960789 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.961203 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-log" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961222 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-log" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.961239 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-log" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961245 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-log" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.961278 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-metadata" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961284 4764 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-metadata" Dec 04 01:15:13 crc kubenswrapper[4764]: E1204 01:15:13.961291 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-api" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961297 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-api" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961456 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-log" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961473 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" containerName="nova-api-api" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961483 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-metadata" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.961490 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" containerName="nova-metadata-log" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.962633 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.965215 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.970292 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.971987 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.973926 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.983859 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:15:13 crc kubenswrapper[4764]: I1204 01:15:13.994990 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066108 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74566398-f898-4c72-bfbf-b77bdf395b33-logs\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066201 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-config-data\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066238 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066261 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-logs\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc 
kubenswrapper[4764]: I1204 01:15:14.066284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qxfc\" (UniqueName: \"kubernetes.io/projected/74566398-f898-4c72-bfbf-b77bdf395b33-kube-api-access-9qxfc\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066306 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066336 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc5sk\" (UniqueName: \"kubernetes.io/projected/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-kube-api-access-vc5sk\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.066360 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-config-data\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.167809 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74566398-f898-4c72-bfbf-b77bdf395b33-logs\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-config-data\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168097 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-logs\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qxfc\" (UniqueName: \"kubernetes.io/projected/74566398-f898-4c72-bfbf-b77bdf395b33-kube-api-access-9qxfc\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168219 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168278 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc5sk\" (UniqueName: \"kubernetes.io/projected/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-kube-api-access-vc5sk\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " 
pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.168331 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-config-data\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.170844 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-logs\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.171829 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74566398-f898-4c72-bfbf-b77bdf395b33-logs\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.187029 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.187220 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-config-data\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.196558 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.196786 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-config-data\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.220568 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc5sk\" (UniqueName: \"kubernetes.io/projected/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-kube-api-access-vc5sk\") pod \"nova-api-0\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.220589 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qxfc\" (UniqueName: \"kubernetes.io/projected/74566398-f898-4c72-bfbf-b77bdf395b33-kube-api-access-9qxfc\") pod \"nova-metadata-0\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.297273 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.304157 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.561374 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d2cba0-a1b9-4974-b37f-0c0c979543cf" path="/var/lib/kubelet/pods/09d2cba0-a1b9-4974-b37f-0c0c979543cf/volumes" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.562439 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c655b87d-83a0-4c58-adcf-c65d5364864b" path="/var/lib/kubelet/pods/c655b87d-83a0-4c58-adcf-c65d5364864b/volumes" Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.826691 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:15:14 crc kubenswrapper[4764]: I1204 01:15:14.905994 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:15:14 crc kubenswrapper[4764]: W1204 01:15:14.916558 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0f76ede_6459_4fb5_96e6_e4ce0870a2f9.slice/crio-7593f2bdb80ef2aad2532af0035cd35f81116670ce4ed57da362a94f88d83bc9 WatchSource:0}: Error finding container 7593f2bdb80ef2aad2532af0035cd35f81116670ce4ed57da362a94f88d83bc9: Status 404 returned error can't find the container with id 7593f2bdb80ef2aad2532af0035cd35f81116670ce4ed57da362a94f88d83bc9 Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.835196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74566398-f898-4c72-bfbf-b77bdf395b33","Type":"ContainerStarted","Data":"fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a"} Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.835627 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74566398-f898-4c72-bfbf-b77bdf395b33","Type":"ContainerStarted","Data":"4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2"} 
Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.835651 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74566398-f898-4c72-bfbf-b77bdf395b33","Type":"ContainerStarted","Data":"b1b03297656ebbadfc99d6fdee5854ab869559322e4e2482bfb1c3d63b57cb3a"} Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.838262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9","Type":"ContainerStarted","Data":"f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081"} Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.838297 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9","Type":"ContainerStarted","Data":"cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974"} Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.838311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9","Type":"ContainerStarted","Data":"7593f2bdb80ef2aad2532af0035cd35f81116670ce4ed57da362a94f88d83bc9"} Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.854534 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.854520102 podStartE2EDuration="2.854520102s" podCreationTimestamp="2025-12-04 01:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:15.853057046 +0000 UTC m=+5651.614381497" watchObservedRunningTime="2025-12-04 01:15:15.854520102 +0000 UTC m=+5651.615844513" Dec 04 01:15:15 crc kubenswrapper[4764]: I1204 01:15:15.876693 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.876668177 podStartE2EDuration="2.876668177s" 
podCreationTimestamp="2025-12-04 01:15:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:15.871754786 +0000 UTC m=+5651.633079227" watchObservedRunningTime="2025-12-04 01:15:15.876668177 +0000 UTC m=+5651.637992598" Dec 04 01:15:17 crc kubenswrapper[4764]: I1204 01:15:17.193007 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 01:15:19 crc kubenswrapper[4764]: I1204 01:15:19.305455 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:15:19 crc kubenswrapper[4764]: I1204 01:15:19.306054 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:15:22 crc kubenswrapper[4764]: I1204 01:15:22.193943 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 01:15:22 crc kubenswrapper[4764]: I1204 01:15:22.231524 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 01:15:22 crc kubenswrapper[4764]: I1204 01:15:22.947978 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 01:15:24 crc kubenswrapper[4764]: I1204 01:15:24.298746 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 01:15:24 crc kubenswrapper[4764]: I1204 01:15:24.299270 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 01:15:24 crc kubenswrapper[4764]: I1204 01:15:24.305461 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 01:15:24 crc kubenswrapper[4764]: I1204 01:15:24.305533 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Dec 04 01:15:25 crc kubenswrapper[4764]: I1204 01:15:25.463251 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:25 crc kubenswrapper[4764]: I1204 01:15:25.463374 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:25 crc kubenswrapper[4764]: I1204 01:15:25.463443 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:25 crc kubenswrapper[4764]: I1204 01:15:25.464349 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:15:34 crc kubenswrapper[4764]: I1204 01:15:34.303313 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 01:15:34 crc kubenswrapper[4764]: I1204 01:15:34.304262 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 01:15:34 crc kubenswrapper[4764]: I1204 01:15:34.304310 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 01:15:34 
crc kubenswrapper[4764]: I1204 01:15:34.308832 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 01:15:34 crc kubenswrapper[4764]: I1204 01:15:34.310705 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 01:15:34 crc kubenswrapper[4764]: I1204 01:15:34.312074 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 01:15:34 crc kubenswrapper[4764]: I1204 01:15:34.312217 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.039470 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.042812 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.044731 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.281461 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c85449d59-8qpmd"] Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.284864 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.326398 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c85449d59-8qpmd"] Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.444565 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4qv\" (UniqueName: \"kubernetes.io/projected/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-kube-api-access-zf4qv\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.444627 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-dns-svc\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.444667 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-nb\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.444685 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-config\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.444738 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.545776 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.546114 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4qv\" (UniqueName: \"kubernetes.io/projected/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-kube-api-access-zf4qv\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.546207 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-dns-svc\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.546291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-nb\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.546380 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-config\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.546493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.547141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-nb\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.547367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-config\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.547367 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-dns-svc\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.563963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4qv\" (UniqueName: \"kubernetes.io/projected/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-kube-api-access-zf4qv\") pod \"dnsmasq-dns-6c85449d59-8qpmd\" 
(UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:35 crc kubenswrapper[4764]: I1204 01:15:35.613907 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:36 crc kubenswrapper[4764]: I1204 01:15:36.118999 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c85449d59-8qpmd"] Dec 04 01:15:37 crc kubenswrapper[4764]: I1204 01:15:37.060738 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerID="331cef01496d133b0cb46f2168c5da7176b2e553cca35fb0ea16a8cdc6544ae3" exitCode=0 Dec 04 01:15:37 crc kubenswrapper[4764]: I1204 01:15:37.060788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" event={"ID":"ac206e4d-b49d-469e-b67f-b5a0fbc286f6","Type":"ContainerDied","Data":"331cef01496d133b0cb46f2168c5da7176b2e553cca35fb0ea16a8cdc6544ae3"} Dec 04 01:15:37 crc kubenswrapper[4764]: I1204 01:15:37.061272 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" event={"ID":"ac206e4d-b49d-469e-b67f-b5a0fbc286f6","Type":"ContainerStarted","Data":"b6e826ad9e860d800a8f08d885def714e2d5ade0340f7d2b2cc7759a896aabfc"} Dec 04 01:15:38 crc kubenswrapper[4764]: I1204 01:15:38.071226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" event={"ID":"ac206e4d-b49d-469e-b67f-b5a0fbc286f6","Type":"ContainerStarted","Data":"6d9af917a3d4b8889868c1136dd43595052c83c86a7446eaca70a3cca36ab12d"} Dec 04 01:15:38 crc kubenswrapper[4764]: I1204 01:15:38.071616 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:38 crc kubenswrapper[4764]: I1204 01:15:38.094920 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" 
podStartSLOduration=3.094901837 podStartE2EDuration="3.094901837s" podCreationTimestamp="2025-12-04 01:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:38.091877703 +0000 UTC m=+5673.853202124" watchObservedRunningTime="2025-12-04 01:15:38.094901837 +0000 UTC m=+5673.856226248" Dec 04 01:15:45 crc kubenswrapper[4764]: I1204 01:15:45.615273 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:15:45 crc kubenswrapper[4764]: I1204 01:15:45.698223 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f499f49d9-4dq6j"] Dec 04 01:15:45 crc kubenswrapper[4764]: I1204 01:15:45.698664 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerName="dnsmasq-dns" containerID="cri-o://edd93145d1bedd3c82c160d5efebed40d403e24ca31b5af3b034c6c490a32b42" gracePeriod=10 Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.173224 4764 generic.go:334] "Generic (PLEG): container finished" podID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerID="edd93145d1bedd3c82c160d5efebed40d403e24ca31b5af3b034c6c490a32b42" exitCode=0 Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.173301 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" event={"ID":"e1f2ea3d-8c91-45e3-92f8-87308627efeb","Type":"ContainerDied","Data":"edd93145d1bedd3c82c160d5efebed40d403e24ca31b5af3b034c6c490a32b42"} Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.173536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j" event={"ID":"e1f2ea3d-8c91-45e3-92f8-87308627efeb","Type":"ContainerDied","Data":"93835aa8d35e3186ad31513169a748e50a7d91c0bdc366c5fce1278d185945d4"} Dec 04 01:15:46 crc 
kubenswrapper[4764]: I1204 01:15:46.173553 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93835aa8d35e3186ad31513169a748e50a7d91c0bdc366c5fce1278d185945d4"
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.186058 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j"
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.302832 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-dns-svc\") pod \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") "
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.302968 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jqld\" (UniqueName: \"kubernetes.io/projected/e1f2ea3d-8c91-45e3-92f8-87308627efeb-kube-api-access-9jqld\") pod \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") "
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.304397 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-sb\") pod \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") "
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.304501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-config\") pod \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") "
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.304665 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-nb\") pod \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\" (UID: \"e1f2ea3d-8c91-45e3-92f8-87308627efeb\") "
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.316952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f2ea3d-8c91-45e3-92f8-87308627efeb-kube-api-access-9jqld" (OuterVolumeSpecName: "kube-api-access-9jqld") pod "e1f2ea3d-8c91-45e3-92f8-87308627efeb" (UID: "e1f2ea3d-8c91-45e3-92f8-87308627efeb"). InnerVolumeSpecName "kube-api-access-9jqld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.362137 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1f2ea3d-8c91-45e3-92f8-87308627efeb" (UID: "e1f2ea3d-8c91-45e3-92f8-87308627efeb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.377210 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1f2ea3d-8c91-45e3-92f8-87308627efeb" (UID: "e1f2ea3d-8c91-45e3-92f8-87308627efeb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.385118 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-config" (OuterVolumeSpecName: "config") pod "e1f2ea3d-8c91-45e3-92f8-87308627efeb" (UID: "e1f2ea3d-8c91-45e3-92f8-87308627efeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.387199 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1f2ea3d-8c91-45e3-92f8-87308627efeb" (UID: "e1f2ea3d-8c91-45e3-92f8-87308627efeb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.407629 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.409294 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.409394 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jqld\" (UniqueName: \"kubernetes.io/projected/e1f2ea3d-8c91-45e3-92f8-87308627efeb-kube-api-access-9jqld\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.409475 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:46 crc kubenswrapper[4764]: I1204 01:15:46.409630 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1f2ea3d-8c91-45e3-92f8-87308627efeb-config\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.186122 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f499f49d9-4dq6j"
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.234004 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f499f49d9-4dq6j"]
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.245862 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f499f49d9-4dq6j"]
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.983308 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-bfttz"]
Dec 04 01:15:47 crc kubenswrapper[4764]: E1204 01:15:47.983950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerName="init"
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.984036 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerName="init"
Dec 04 01:15:47 crc kubenswrapper[4764]: E1204 01:15:47.984096 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerName="dnsmasq-dns"
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.984147 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerName="dnsmasq-dns"
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.984362 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" containerName="dnsmasq-dns"
Dec 04 01:15:47 crc kubenswrapper[4764]: I1204 01:15:47.985013 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.001081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bfttz"]
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.087805 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8f64-account-create-update-phfqx"]
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.089225 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.091627 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.097630 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8f64-account-create-update-phfqx"]
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.153763 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-operator-scripts\") pod \"cinder-db-create-bfttz\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") " pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.154022 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxrg5\" (UniqueName: \"kubernetes.io/projected/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-kube-api-access-fxrg5\") pod \"cinder-db-create-bfttz\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") " pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.255562 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-operator-scripts\") pod \"cinder-8f64-account-create-update-phfqx\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") " pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.255700 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlzcw\" (UniqueName: \"kubernetes.io/projected/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-kube-api-access-xlzcw\") pod \"cinder-8f64-account-create-update-phfqx\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") " pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.255751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxrg5\" (UniqueName: \"kubernetes.io/projected/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-kube-api-access-fxrg5\") pod \"cinder-db-create-bfttz\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") " pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.255941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-operator-scripts\") pod \"cinder-db-create-bfttz\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") " pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.257250 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-operator-scripts\") pod \"cinder-db-create-bfttz\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") " pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.275334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxrg5\" (UniqueName: \"kubernetes.io/projected/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-kube-api-access-fxrg5\") pod \"cinder-db-create-bfttz\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") " pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.302654 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.358477 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlzcw\" (UniqueName: \"kubernetes.io/projected/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-kube-api-access-xlzcw\") pod \"cinder-8f64-account-create-update-phfqx\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") " pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.358834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-operator-scripts\") pod \"cinder-8f64-account-create-update-phfqx\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") " pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.359666 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-operator-scripts\") pod \"cinder-8f64-account-create-update-phfqx\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") " pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.386224 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlzcw\" (UniqueName: \"kubernetes.io/projected/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-kube-api-access-xlzcw\") pod \"cinder-8f64-account-create-update-phfqx\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") " pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.416189 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.591410 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1f2ea3d-8c91-45e3-92f8-87308627efeb" path="/var/lib/kubelet/pods/e1f2ea3d-8c91-45e3-92f8-87308627efeb/volumes"
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.763219 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-bfttz"]
Dec 04 01:15:48 crc kubenswrapper[4764]: I1204 01:15:48.893860 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8f64-account-create-update-phfqx"]
Dec 04 01:15:48 crc kubenswrapper[4764]: W1204 01:15:48.900265 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e9ca50e_771a_4f4a_94ef_2a54aeb2a83d.slice/crio-1aadac31d85d4e9d28331beee8d4492f6eb42bb1f98e3bcc5904a9473ea3c663 WatchSource:0}: Error finding container 1aadac31d85d4e9d28331beee8d4492f6eb42bb1f98e3bcc5904a9473ea3c663: Status 404 returned error can't find the container with id 1aadac31d85d4e9d28331beee8d4492f6eb42bb1f98e3bcc5904a9473ea3c663
Dec 04 01:15:49 crc kubenswrapper[4764]: I1204 01:15:49.204608 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8f64-account-create-update-phfqx" event={"ID":"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d","Type":"ContainerStarted","Data":"370a8648c8b29a2d57c0cd6b73bf6e54e4feb331d5b86df4b7de1975d039f6a2"}
Dec 04 01:15:49 crc kubenswrapper[4764]: I1204 01:15:49.204659 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8f64-account-create-update-phfqx" event={"ID":"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d","Type":"ContainerStarted","Data":"1aadac31d85d4e9d28331beee8d4492f6eb42bb1f98e3bcc5904a9473ea3c663"}
Dec 04 01:15:49 crc kubenswrapper[4764]: I1204 01:15:49.206573 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" containerID="c57e6dd1f938f64ce27864009cffb12191dc4c512f1ee10f7f4449a09a5e11ae" exitCode=0
Dec 04 01:15:49 crc kubenswrapper[4764]: I1204 01:15:49.206619 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bfttz" event={"ID":"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac","Type":"ContainerDied","Data":"c57e6dd1f938f64ce27864009cffb12191dc4c512f1ee10f7f4449a09a5e11ae"}
Dec 04 01:15:49 crc kubenswrapper[4764]: I1204 01:15:49.206645 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bfttz" event={"ID":"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac","Type":"ContainerStarted","Data":"759a8cdcc5a16a7e7f1364ea1d8abfad3d85ca013e979c53ec586bf28411d9c1"}
Dec 04 01:15:49 crc kubenswrapper[4764]: I1204 01:15:49.228412 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8f64-account-create-update-phfqx" podStartSLOduration=1.22837528 podStartE2EDuration="1.22837528s" podCreationTimestamp="2025-12-04 01:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:49.220511216 +0000 UTC m=+5684.981835627" watchObservedRunningTime="2025-12-04 01:15:49.22837528 +0000 UTC m=+5684.989699691"
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.219229 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" containerID="370a8648c8b29a2d57c0cd6b73bf6e54e4feb331d5b86df4b7de1975d039f6a2" exitCode=0
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.219338 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8f64-account-create-update-phfqx" event={"ID":"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d","Type":"ContainerDied","Data":"370a8648c8b29a2d57c0cd6b73bf6e54e4feb331d5b86df4b7de1975d039f6a2"}
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.693200 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.823163 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-operator-scripts\") pod \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") "
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.823473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxrg5\" (UniqueName: \"kubernetes.io/projected/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-kube-api-access-fxrg5\") pod \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\" (UID: \"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac\") "
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.824450 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" (UID: "3f825b0f-a1a7-4fc7-94f2-44e076ada8ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.830239 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-kube-api-access-fxrg5" (OuterVolumeSpecName: "kube-api-access-fxrg5") pod "3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" (UID: "3f825b0f-a1a7-4fc7-94f2-44e076ada8ac"). InnerVolumeSpecName "kube-api-access-fxrg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.925944 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:50 crc kubenswrapper[4764]: I1204 01:15:50.925991 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxrg5\" (UniqueName: \"kubernetes.io/projected/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac-kube-api-access-fxrg5\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.233609 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-bfttz"
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.234756 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-bfttz" event={"ID":"3f825b0f-a1a7-4fc7-94f2-44e076ada8ac","Type":"ContainerDied","Data":"759a8cdcc5a16a7e7f1364ea1d8abfad3d85ca013e979c53ec586bf28411d9c1"}
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.234802 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="759a8cdcc5a16a7e7f1364ea1d8abfad3d85ca013e979c53ec586bf28411d9c1"
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.627707 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.740741 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlzcw\" (UniqueName: \"kubernetes.io/projected/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-kube-api-access-xlzcw\") pod \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") "
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.740867 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-operator-scripts\") pod \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\" (UID: \"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d\") "
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.741383 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" (UID: "0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.741779 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.743847 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-kube-api-access-xlzcw" (OuterVolumeSpecName: "kube-api-access-xlzcw") pod "0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" (UID: "0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d"). InnerVolumeSpecName "kube-api-access-xlzcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 01:15:51 crc kubenswrapper[4764]: I1204 01:15:51.843027 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlzcw\" (UniqueName: \"kubernetes.io/projected/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d-kube-api-access-xlzcw\") on node \"crc\" DevicePath \"\""
Dec 04 01:15:52 crc kubenswrapper[4764]: I1204 01:15:52.253387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8f64-account-create-update-phfqx" event={"ID":"0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d","Type":"ContainerDied","Data":"1aadac31d85d4e9d28331beee8d4492f6eb42bb1f98e3bcc5904a9473ea3c663"}
Dec 04 01:15:52 crc kubenswrapper[4764]: I1204 01:15:52.253728 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1aadac31d85d4e9d28331beee8d4492f6eb42bb1f98e3bcc5904a9473ea3c663"
Dec 04 01:15:52 crc kubenswrapper[4764]: I1204 01:15:52.253794 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8f64-account-create-update-phfqx"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.250755 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ds7ng"]
Dec 04 01:15:53 crc kubenswrapper[4764]: E1204 01:15:53.251162 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" containerName="mariadb-account-create-update"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.251183 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" containerName="mariadb-account-create-update"
Dec 04 01:15:53 crc kubenswrapper[4764]: E1204 01:15:53.251224 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" containerName="mariadb-database-create"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.251233 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" containerName="mariadb-database-create"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.251444 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" containerName="mariadb-account-create-update"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.251469 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" containerName="mariadb-database-create"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.252205 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.254438 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.255689 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.256243 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qxc88"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.268055 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ds7ng"]
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.371430 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c543fb01-7118-455a-a07b-2618a4c0368a-etc-machine-id\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.371493 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-config-data\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.371520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-452j7\" (UniqueName: \"kubernetes.io/projected/c543fb01-7118-455a-a07b-2618a4c0368a-kube-api-access-452j7\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.371613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-scripts\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.371779 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-combined-ca-bundle\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.371883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-db-sync-config-data\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473238 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-db-sync-config-data\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473348 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c543fb01-7118-455a-a07b-2618a4c0368a-etc-machine-id\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473392 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-config-data\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-452j7\" (UniqueName: \"kubernetes.io/projected/c543fb01-7118-455a-a07b-2618a4c0368a-kube-api-access-452j7\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473450 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-scripts\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473458 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c543fb01-7118-455a-a07b-2618a4c0368a-etc-machine-id\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.473526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-combined-ca-bundle\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.479160 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-config-data\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.479185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-db-sync-config-data\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.479404 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-combined-ca-bundle\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.479775 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-scripts\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.497135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-452j7\" (UniqueName: \"kubernetes.io/projected/c543fb01-7118-455a-a07b-2618a4c0368a-kube-api-access-452j7\") pod \"cinder-db-sync-ds7ng\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:53 crc kubenswrapper[4764]: I1204 01:15:53.619609 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ds7ng"
Dec 04 01:15:54 crc kubenswrapper[4764]: I1204 01:15:54.211983 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ds7ng"]
Dec 04 01:15:54 crc kubenswrapper[4764]: I1204 01:15:54.280829 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ds7ng" event={"ID":"c543fb01-7118-455a-a07b-2618a4c0368a","Type":"ContainerStarted","Data":"aa3e63d8323da54eb18e53c8450ec2cfa799c916b69bf086715c8077c80d60b6"}
Dec 04 01:15:55 crc kubenswrapper[4764]: I1204 01:15:55.293441 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ds7ng" event={"ID":"c543fb01-7118-455a-a07b-2618a4c0368a","Type":"ContainerStarted","Data":"41ea1f1a60e912880372494e25f2c8e40bee298af1458850dfd793690d496454"}
Dec 04 01:15:55 crc kubenswrapper[4764]: I1204 01:15:55.321849 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ds7ng" podStartSLOduration=2.321828219 podStartE2EDuration="2.321828219s" podCreationTimestamp="2025-12-04 01:15:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:15:55.312474859 +0000 UTC m=+5691.073799280" watchObservedRunningTime="2025-12-04 01:15:55.321828219 +0000 UTC m=+5691.083152630"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.134565 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x5lqm"]
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.140169 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.149769 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x5lqm"]
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.232797 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-catalog-content\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.232884 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czst7\" (UniqueName: \"kubernetes.io/projected/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-kube-api-access-czst7\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.233219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-utilities\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.335280 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-catalog-content\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.335347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czst7\" (UniqueName: \"kubernetes.io/projected/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-kube-api-access-czst7\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.335436 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-utilities\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.336057 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-utilities\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.336360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-catalog-content\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.356955 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czst7\" (UniqueName: \"kubernetes.io/projected/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-kube-api-access-czst7\") pod \"redhat-operators-x5lqm\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.466778 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5lqm"
Dec 04 01:15:56 crc kubenswrapper[4764]: I1204 01:15:56.921566 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x5lqm"]
Dec 04 01:15:57 crc kubenswrapper[4764]: I1204 01:15:57.312766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerDied","Data":"e71e0e23d1d34976d692f04ebbefed05d9c60f0daf0f2841e60d8ab204ba9d6e"}
Dec 04 01:15:57 crc kubenswrapper[4764]: I1204 01:15:57.312402 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerID="e71e0e23d1d34976d692f04ebbefed05d9c60f0daf0f2841e60d8ab204ba9d6e" exitCode=0
Dec 04 01:15:57 crc kubenswrapper[4764]: I1204 01:15:57.313281 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerStarted","Data":"b20bec09e781958a507c82c657343e4afe0025af3e3b5f8ee68d83a23341dc17"}
Dec 04 01:15:57 crc kubenswrapper[4764]: I1204 01:15:57.314011 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 01:15:58 crc kubenswrapper[4764]: I1204 01:15:58.330697 4764 generic.go:334] "Generic (PLEG): container finished" podID="c543fb01-7118-455a-a07b-2618a4c0368a" containerID="41ea1f1a60e912880372494e25f2c8e40bee298af1458850dfd793690d496454" exitCode=0
Dec 04 01:15:58 crc kubenswrapper[4764]: I1204 01:15:58.330783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ds7ng" event={"ID":"c543fb01-7118-455a-a07b-2618a4c0368a","Type":"ContainerDied","Data":"41ea1f1a60e912880372494e25f2c8e40bee298af1458850dfd793690d496454"}
Dec 04 01:15:58 crc kubenswrapper[4764]: I1204 01:15:58.334607 4764 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerStarted","Data":"b560cff1b84473af801fae5268b0f3ace4a3391a2ce59335e2f5959e2f9259da"} Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.349365 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerID="b560cff1b84473af801fae5268b0f3ace4a3391a2ce59335e2f5959e2f9259da" exitCode=0 Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.349467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerDied","Data":"b560cff1b84473af801fae5268b0f3ace4a3391a2ce59335e2f5959e2f9259da"} Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.771455 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ds7ng" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903235 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-db-sync-config-data\") pod \"c543fb01-7118-455a-a07b-2618a4c0368a\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903425 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-config-data\") pod \"c543fb01-7118-455a-a07b-2618a4c0368a\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903464 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c543fb01-7118-455a-a07b-2618a4c0368a-etc-machine-id\") pod \"c543fb01-7118-455a-a07b-2618a4c0368a\" (UID: 
\"c543fb01-7118-455a-a07b-2618a4c0368a\") " Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903513 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-452j7\" (UniqueName: \"kubernetes.io/projected/c543fb01-7118-455a-a07b-2618a4c0368a-kube-api-access-452j7\") pod \"c543fb01-7118-455a-a07b-2618a4c0368a\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903566 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-scripts\") pod \"c543fb01-7118-455a-a07b-2618a4c0368a\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903628 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-combined-ca-bundle\") pod \"c543fb01-7118-455a-a07b-2618a4c0368a\" (UID: \"c543fb01-7118-455a-a07b-2618a4c0368a\") " Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.903636 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c543fb01-7118-455a-a07b-2618a4c0368a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c543fb01-7118-455a-a07b-2618a4c0368a" (UID: "c543fb01-7118-455a-a07b-2618a4c0368a"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.904220 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c543fb01-7118-455a-a07b-2618a4c0368a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.908828 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c543fb01-7118-455a-a07b-2618a4c0368a-kube-api-access-452j7" (OuterVolumeSpecName: "kube-api-access-452j7") pod "c543fb01-7118-455a-a07b-2618a4c0368a" (UID: "c543fb01-7118-455a-a07b-2618a4c0368a"). InnerVolumeSpecName "kube-api-access-452j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.916090 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-scripts" (OuterVolumeSpecName: "scripts") pod "c543fb01-7118-455a-a07b-2618a4c0368a" (UID: "c543fb01-7118-455a-a07b-2618a4c0368a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.922282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c543fb01-7118-455a-a07b-2618a4c0368a" (UID: "c543fb01-7118-455a-a07b-2618a4c0368a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.933446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c543fb01-7118-455a-a07b-2618a4c0368a" (UID: "c543fb01-7118-455a-a07b-2618a4c0368a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:15:59 crc kubenswrapper[4764]: I1204 01:15:59.997995 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-config-data" (OuterVolumeSpecName: "config-data") pod "c543fb01-7118-455a-a07b-2618a4c0368a" (UID: "c543fb01-7118-455a-a07b-2618a4c0368a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.006332 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.006404 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-452j7\" (UniqueName: \"kubernetes.io/projected/c543fb01-7118-455a-a07b-2618a4c0368a-kube-api-access-452j7\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.006434 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.006453 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 
01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.006472 4764 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c543fb01-7118-455a-a07b-2618a4c0368a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.370419 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ds7ng" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.370435 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ds7ng" event={"ID":"c543fb01-7118-455a-a07b-2618a4c0368a","Type":"ContainerDied","Data":"aa3e63d8323da54eb18e53c8450ec2cfa799c916b69bf086715c8077c80d60b6"} Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.370533 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa3e63d8323da54eb18e53c8450ec2cfa799c916b69bf086715c8077c80d60b6" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.373255 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerStarted","Data":"405cef412c34a086422191c5f991ecaa2f8748317bf920bf6c73c057f88387ff"} Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.403337 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x5lqm" podStartSLOduration=1.8519118799999998 podStartE2EDuration="4.403316626s" podCreationTimestamp="2025-12-04 01:15:56 +0000 UTC" firstStartedPulling="2025-12-04 01:15:57.313811744 +0000 UTC m=+5693.075136155" lastFinishedPulling="2025-12-04 01:15:59.86521646 +0000 UTC m=+5695.626540901" observedRunningTime="2025-12-04 01:16:00.391098306 +0000 UTC m=+5696.152422717" watchObservedRunningTime="2025-12-04 01:16:00.403316626 +0000 UTC m=+5696.164641047" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.712607 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd4b7497f-nls86"] Dec 04 01:16:00 crc kubenswrapper[4764]: E1204 01:16:00.722274 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c543fb01-7118-455a-a07b-2618a4c0368a" containerName="cinder-db-sync" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.722524 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c543fb01-7118-455a-a07b-2618a4c0368a" containerName="cinder-db-sync" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.722933 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c543fb01-7118-455a-a07b-2618a4c0368a" containerName="cinder-db-sync" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.724319 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.725797 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd4b7497f-nls86"] Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.821570 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86trz\" (UniqueName: \"kubernetes.io/projected/27f4424a-d562-42ae-aaac-879944c2134b-kube-api-access-86trz\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.821987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-config\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.822043 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-dns-svc\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.822120 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.822287 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.875643 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.877303 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.882849 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.883063 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.883166 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qxc88" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.884927 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.893779 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.924066 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.924138 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86trz\" (UniqueName: \"kubernetes.io/projected/27f4424a-d562-42ae-aaac-879944c2134b-kube-api-access-86trz\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.924187 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-config\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " 
pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.924222 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-dns-svc\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.924254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.925193 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.925681 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-config\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.926191 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-dns-svc\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.934035 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:00 crc kubenswrapper[4764]: I1204 01:16:00.948485 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86trz\" (UniqueName: \"kubernetes.io/projected/27f4424a-d562-42ae-aaac-879944c2134b-kube-api-access-86trz\") pod \"dnsmasq-dns-7bd4b7497f-nls86\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.025198 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data-custom\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.025373 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ded4db-32b3-401a-9c1c-4751895ce624-logs\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.025449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73ded4db-32b3-401a-9c1c-4751895ce624-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.025593 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.025784 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-scripts\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.026048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgmhz\" (UniqueName: \"kubernetes.io/projected/73ded4db-32b3-401a-9c1c-4751895ce624-kube-api-access-wgmhz\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.026082 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.062244 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.127914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.127972 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-scripts\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgmhz\" (UniqueName: \"kubernetes.io/projected/73ded4db-32b3-401a-9c1c-4751895ce624-kube-api-access-wgmhz\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128091 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data-custom\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ded4db-32b3-401a-9c1c-4751895ce624-logs\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73ded4db-32b3-401a-9c1c-4751895ce624-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73ded4db-32b3-401a-9c1c-4751895ce624-etc-machine-id\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.128621 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ded4db-32b3-401a-9c1c-4751895ce624-logs\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.139993 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.140481 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-scripts\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.140529 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data-custom\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.142682 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.154047 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgmhz\" (UniqueName: \"kubernetes.io/projected/73ded4db-32b3-401a-9c1c-4751895ce624-kube-api-access-wgmhz\") pod \"cinder-api-0\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.200786 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.539455 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd4b7497f-nls86"] Dec 04 01:16:01 crc kubenswrapper[4764]: W1204 01:16:01.540239 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27f4424a_d562_42ae_aaac_879944c2134b.slice/crio-29eeaa7021392724baeabb350949b3822d6ceee8a2e5c7112cdf4e195110435a WatchSource:0}: Error finding container 29eeaa7021392724baeabb350949b3822d6ceee8a2e5c7112cdf4e195110435a: Status 404 returned error can't find the container with id 29eeaa7021392724baeabb350949b3822d6ceee8a2e5c7112cdf4e195110435a Dec 04 01:16:01 crc kubenswrapper[4764]: I1204 01:16:01.714306 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:01 crc kubenswrapper[4764]: W1204 01:16:01.720262 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73ded4db_32b3_401a_9c1c_4751895ce624.slice/crio-644cb85a54772dae8cbbe4857a8baa5fa821cea3717d51c4ddf7fd88dc6cdb02 WatchSource:0}: Error finding container 644cb85a54772dae8cbbe4857a8baa5fa821cea3717d51c4ddf7fd88dc6cdb02: Status 404 returned error can't find the container with id 644cb85a54772dae8cbbe4857a8baa5fa821cea3717d51c4ddf7fd88dc6cdb02 Dec 04 01:16:02 crc kubenswrapper[4764]: I1204 01:16:02.410095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73ded4db-32b3-401a-9c1c-4751895ce624","Type":"ContainerStarted","Data":"fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b"} Dec 04 01:16:02 crc kubenswrapper[4764]: I1204 01:16:02.410843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"73ded4db-32b3-401a-9c1c-4751895ce624","Type":"ContainerStarted","Data":"644cb85a54772dae8cbbe4857a8baa5fa821cea3717d51c4ddf7fd88dc6cdb02"} Dec 04 01:16:02 crc kubenswrapper[4764]: I1204 01:16:02.412283 4764 generic.go:334] "Generic (PLEG): container finished" podID="27f4424a-d562-42ae-aaac-879944c2134b" containerID="3a918550b44845f01662d4bd4a87f09926fde95ff6eb29fd60f4ac6a9896bd6a" exitCode=0 Dec 04 01:16:02 crc kubenswrapper[4764]: I1204 01:16:02.412311 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" event={"ID":"27f4424a-d562-42ae-aaac-879944c2134b","Type":"ContainerDied","Data":"3a918550b44845f01662d4bd4a87f09926fde95ff6eb29fd60f4ac6a9896bd6a"} Dec 04 01:16:02 crc kubenswrapper[4764]: I1204 01:16:02.412325 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" event={"ID":"27f4424a-d562-42ae-aaac-879944c2134b","Type":"ContainerStarted","Data":"29eeaa7021392724baeabb350949b3822d6ceee8a2e5c7112cdf4e195110435a"} Dec 04 01:16:03 crc kubenswrapper[4764]: I1204 01:16:03.430971 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73ded4db-32b3-401a-9c1c-4751895ce624","Type":"ContainerStarted","Data":"35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3"} Dec 04 01:16:03 crc kubenswrapper[4764]: I1204 01:16:03.432341 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 01:16:03 crc kubenswrapper[4764]: I1204 01:16:03.434772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" event={"ID":"27f4424a-d562-42ae-aaac-879944c2134b","Type":"ContainerStarted","Data":"1ce0ec671b8a052db3202bacd52e10f2b29f63289079f6235e6dfd6550497690"} Dec 04 01:16:03 crc kubenswrapper[4764]: I1204 01:16:03.435304 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 
01:16:03 crc kubenswrapper[4764]: I1204 01:16:03.453384 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.453368187 podStartE2EDuration="3.453368187s" podCreationTimestamp="2025-12-04 01:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:03.447382 +0000 UTC m=+5699.208706411" watchObservedRunningTime="2025-12-04 01:16:03.453368187 +0000 UTC m=+5699.214692598" Dec 04 01:16:03 crc kubenswrapper[4764]: I1204 01:16:03.469071 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" podStartSLOduration=3.469051313 podStartE2EDuration="3.469051313s" podCreationTimestamp="2025-12-04 01:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:03.464773428 +0000 UTC m=+5699.226097839" watchObservedRunningTime="2025-12-04 01:16:03.469051313 +0000 UTC m=+5699.230375744" Dec 04 01:16:06 crc kubenswrapper[4764]: I1204 01:16:06.467694 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x5lqm" Dec 04 01:16:06 crc kubenswrapper[4764]: I1204 01:16:06.469061 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x5lqm" Dec 04 01:16:06 crc kubenswrapper[4764]: I1204 01:16:06.541393 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x5lqm" Dec 04 01:16:07 crc kubenswrapper[4764]: I1204 01:16:07.522359 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x5lqm" Dec 04 01:16:07 crc kubenswrapper[4764]: I1204 01:16:07.580870 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-x5lqm"] Dec 04 01:16:09 crc kubenswrapper[4764]: I1204 01:16:09.495304 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x5lqm" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="registry-server" containerID="cri-o://405cef412c34a086422191c5f991ecaa2f8748317bf920bf6c73c057f88387ff" gracePeriod=2 Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.063844 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.119146 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c85449d59-8qpmd"] Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.119375 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerName="dnsmasq-dns" containerID="cri-o://6d9af917a3d4b8889868c1136dd43595052c83c86a7446eaca70a3cca36ab12d" gracePeriod=10 Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.605688 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerID="405cef412c34a086422191c5f991ecaa2f8748317bf920bf6c73c057f88387ff" exitCode=0 Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.606114 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerDied","Data":"405cef412c34a086422191c5f991ecaa2f8748317bf920bf6c73c057f88387ff"} Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.625953 4764 generic.go:334] "Generic (PLEG): container finished" podID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerID="6d9af917a3d4b8889868c1136dd43595052c83c86a7446eaca70a3cca36ab12d" exitCode=0 Dec 04 01:16:11 crc kubenswrapper[4764]: 
I1204 01:16:11.625999 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" event={"ID":"ac206e4d-b49d-469e-b67f-b5a0fbc286f6","Type":"ContainerDied","Data":"6d9af917a3d4b8889868c1136dd43595052c83c86a7446eaca70a3cca36ab12d"} Dec 04 01:16:11 crc kubenswrapper[4764]: I1204 01:16:11.971134 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5lqm" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.066871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-utilities\") pod \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.066933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-catalog-content\") pod \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.066995 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czst7\" (UniqueName: \"kubernetes.io/projected/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-kube-api-access-czst7\") pod \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\" (UID: \"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.068896 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-utilities" (OuterVolumeSpecName: "utilities") pod "7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" (UID: "7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.089045 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-kube-api-access-czst7" (OuterVolumeSpecName: "kube-api-access-czst7") pod "7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" (UID: "7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0"). InnerVolumeSpecName "kube-api-access-czst7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.169031 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.169061 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czst7\" (UniqueName: \"kubernetes.io/projected/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-kube-api-access-czst7\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.190757 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.196681 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" (UID: "7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.270392 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.371912 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-sb\") pod \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.372020 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-config\") pod \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.372062 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-nb\") pod \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.372095 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf4qv\" (UniqueName: \"kubernetes.io/projected/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-kube-api-access-zf4qv\") pod \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.372124 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-dns-svc\") pod 
\"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\" (UID: \"ac206e4d-b49d-469e-b67f-b5a0fbc286f6\") " Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.391389 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-kube-api-access-zf4qv" (OuterVolumeSpecName: "kube-api-access-zf4qv") pod "ac206e4d-b49d-469e-b67f-b5a0fbc286f6" (UID: "ac206e4d-b49d-469e-b67f-b5a0fbc286f6"). InnerVolumeSpecName "kube-api-access-zf4qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.412040 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac206e4d-b49d-469e-b67f-b5a0fbc286f6" (UID: "ac206e4d-b49d-469e-b67f-b5a0fbc286f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.420894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ac206e4d-b49d-469e-b67f-b5a0fbc286f6" (UID: "ac206e4d-b49d-469e-b67f-b5a0fbc286f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.433852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-config" (OuterVolumeSpecName: "config") pod "ac206e4d-b49d-469e-b67f-b5a0fbc286f6" (UID: "ac206e4d-b49d-469e-b67f-b5a0fbc286f6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.435003 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ac206e4d-b49d-469e-b67f-b5a0fbc286f6" (UID: "ac206e4d-b49d-469e-b67f-b5a0fbc286f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.473957 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.473995 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.474006 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.474018 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf4qv\" (UniqueName: \"kubernetes.io/projected/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-kube-api-access-zf4qv\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.474031 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac206e4d-b49d-469e-b67f-b5a0fbc286f6-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.638378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5lqm" 
event={"ID":"7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0","Type":"ContainerDied","Data":"b20bec09e781958a507c82c657343e4afe0025af3e3b5f8ee68d83a23341dc17"} Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.638448 4764 scope.go:117] "RemoveContainer" containerID="405cef412c34a086422191c5f991ecaa2f8748317bf920bf6c73c057f88387ff" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.638618 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5lqm" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.644399 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" event={"ID":"ac206e4d-b49d-469e-b67f-b5a0fbc286f6","Type":"ContainerDied","Data":"b6e826ad9e860d800a8f08d885def714e2d5ade0340f7d2b2cc7759a896aabfc"} Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.644451 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c85449d59-8qpmd" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.689924 4764 scope.go:117] "RemoveContainer" containerID="b560cff1b84473af801fae5268b0f3ace4a3391a2ce59335e2f5959e2f9259da" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.706315 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x5lqm"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.732309 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x5lqm"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.753181 4764 scope.go:117] "RemoveContainer" containerID="e71e0e23d1d34976d692f04ebbefed05d9c60f0daf0f2841e60d8ab204ba9d6e" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.757753 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c85449d59-8qpmd"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.763956 4764 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/dnsmasq-dns-6c85449d59-8qpmd"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.772973 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.775965 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-log" containerID="cri-o://4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.776577 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-metadata" containerID="cri-o://fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.780236 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.780531 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" containerName="nova-scheduler-scheduler" containerID="cri-o://92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.787897 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.788186 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-log" containerID="cri-o://cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.788359 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-api" containerID="cri-o://f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.795262 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.795524 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e16b3743-1815-4473-84f5-0cb21a1bebee" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.803590 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.803806 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="73bd46f9-c2db-40fb-a8bd-bb922f14fef4" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f" gracePeriod=30 Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.822926 4764 scope.go:117] "RemoveContainer" containerID="6d9af917a3d4b8889868c1136dd43595052c83c86a7446eaca70a3cca36ab12d" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.855151 4764 scope.go:117] "RemoveContainer" containerID="331cef01496d133b0cb46f2168c5da7176b2e553cca35fb0ea16a8cdc6544ae3" Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.901630 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:16:12 crc kubenswrapper[4764]: I1204 01:16:12.901898 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-cell1-conductor-0" podUID="15ee17b2-31d9-499e-aea4-713c272534f8" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a" gracePeriod=30 Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.471620 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.649253 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.658244 4764 generic.go:334] "Generic (PLEG): container finished" podID="74566398-f898-4c72-bfbf-b77bdf395b33" containerID="4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2" exitCode=143 Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.658312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74566398-f898-4c72-bfbf-b77bdf395b33","Type":"ContainerDied","Data":"4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2"} Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.665092 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerID="cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974" exitCode=143 Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.665161 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9","Type":"ContainerDied","Data":"cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974"} Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.685322 4764 generic.go:334] "Generic (PLEG): container finished" podID="e16b3743-1815-4473-84f5-0cb21a1bebee" containerID="ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1" exitCode=0 Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 
01:16:13.685362 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.685380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e16b3743-1815-4473-84f5-0cb21a1bebee","Type":"ContainerDied","Data":"ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1"} Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.685420 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e16b3743-1815-4473-84f5-0cb21a1bebee","Type":"ContainerDied","Data":"2cc2f7efbe318a18fd099738bd2602f333d221a4ce5bc4b41ba833cc08a82b31"} Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.685440 4764 scope.go:117] "RemoveContainer" containerID="ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.715887 4764 scope.go:117] "RemoveContainer" containerID="ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1" Dec 04 01:16:13 crc kubenswrapper[4764]: E1204 01:16:13.716482 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1\": container with ID starting with ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1 not found: ID does not exist" containerID="ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.716521 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1"} err="failed to get container status \"ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1\": rpc error: code = NotFound desc = could not find container 
\"ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1\": container with ID starting with ded303661c46e020b319ae8f7769228f499c650bb6b24134726ba0062879d6e1 not found: ID does not exist" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.803041 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-config-data\") pod \"e16b3743-1815-4473-84f5-0cb21a1bebee\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.803120 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-combined-ca-bundle\") pod \"e16b3743-1815-4473-84f5-0cb21a1bebee\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.803171 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvhfw\" (UniqueName: \"kubernetes.io/projected/e16b3743-1815-4473-84f5-0cb21a1bebee-kube-api-access-vvhfw\") pod \"e16b3743-1815-4473-84f5-0cb21a1bebee\" (UID: \"e16b3743-1815-4473-84f5-0cb21a1bebee\") " Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.822183 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16b3743-1815-4473-84f5-0cb21a1bebee-kube-api-access-vvhfw" (OuterVolumeSpecName: "kube-api-access-vvhfw") pod "e16b3743-1815-4473-84f5-0cb21a1bebee" (UID: "e16b3743-1815-4473-84f5-0cb21a1bebee"). InnerVolumeSpecName "kube-api-access-vvhfw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.833751 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-config-data" (OuterVolumeSpecName: "config-data") pod "e16b3743-1815-4473-84f5-0cb21a1bebee" (UID: "e16b3743-1815-4473-84f5-0cb21a1bebee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.840262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e16b3743-1815-4473-84f5-0cb21a1bebee" (UID: "e16b3743-1815-4473-84f5-0cb21a1bebee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.905950 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.905993 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e16b3743-1815-4473-84f5-0cb21a1bebee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:13 crc kubenswrapper[4764]: I1204 01:16:13.906007 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvhfw\" (UniqueName: \"kubernetes.io/projected/e16b3743-1815-4473-84f5-0cb21a1bebee-kube-api-access-vvhfw\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.031379 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.042201 4764 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069049 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:16:14 crc kubenswrapper[4764]: E1204 01:16:14.069470 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e16b3743-1815-4473-84f5-0cb21a1bebee" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069490 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e16b3743-1815-4473-84f5-0cb21a1bebee" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 01:16:14 crc kubenswrapper[4764]: E1204 01:16:14.069506 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="extract-utilities" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069515 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="extract-utilities" Dec 04 01:16:14 crc kubenswrapper[4764]: E1204 01:16:14.069534 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerName="init" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069543 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerName="init" Dec 04 01:16:14 crc kubenswrapper[4764]: E1204 01:16:14.069559 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="registry-server" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069567 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="registry-server" Dec 04 01:16:14 crc kubenswrapper[4764]: E1204 01:16:14.069582 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" 
containerName="extract-content" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069590 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="extract-content" Dec 04 01:16:14 crc kubenswrapper[4764]: E1204 01:16:14.069610 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerName="dnsmasq-dns" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069619 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerName="dnsmasq-dns" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069844 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" containerName="dnsmasq-dns" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069872 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" containerName="registry-server" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.069895 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e16b3743-1815-4473-84f5-0cb21a1bebee" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.070589 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.072931 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.106147 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.217105 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d816b6df-2de6-4e61-9612-613ec427bd48-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.217290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d816b6df-2de6-4e61-9612-613ec427bd48-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.217349 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkfwb\" (UniqueName: \"kubernetes.io/projected/d816b6df-2de6-4e61-9612-613ec427bd48-kube-api-access-wkfwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.319031 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d816b6df-2de6-4e61-9612-613ec427bd48-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 
01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.319365 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkfwb\" (UniqueName: \"kubernetes.io/projected/d816b6df-2de6-4e61-9612-613ec427bd48-kube-api-access-wkfwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.319451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d816b6df-2de6-4e61-9612-613ec427bd48-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.323732 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d816b6df-2de6-4e61-9612-613ec427bd48-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.325348 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d816b6df-2de6-4e61-9612-613ec427bd48-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.335811 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkfwb\" (UniqueName: \"kubernetes.io/projected/d816b6df-2de6-4e61-9612-613ec427bd48-kube-api-access-wkfwb\") pod \"nova-cell1-novncproxy-0\" (UID: \"d816b6df-2de6-4e61-9612-613ec427bd48\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.426901 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.558904 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0" path="/var/lib/kubelet/pods/7bb22d3d-77c2-4dc3-98d1-0ad9e9ce2bb0/volumes" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.559547 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac206e4d-b49d-469e-b67f-b5a0fbc286f6" path="/var/lib/kubelet/pods/ac206e4d-b49d-469e-b67f-b5a0fbc286f6/volumes" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.560116 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16b3743-1815-4473-84f5-0cb21a1bebee" path="/var/lib/kubelet/pods/e16b3743-1815-4473-84f5-0cb21a1bebee/volumes" Dec 04 01:16:14 crc kubenswrapper[4764]: I1204 01:16:14.908306 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.213656 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.338495 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6267\" (UniqueName: \"kubernetes.io/projected/15ee17b2-31d9-499e-aea4-713c272534f8-kube-api-access-g6267\") pod \"15ee17b2-31d9-499e-aea4-713c272534f8\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.338605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-combined-ca-bundle\") pod \"15ee17b2-31d9-499e-aea4-713c272534f8\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.338646 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-config-data\") pod \"15ee17b2-31d9-499e-aea4-713c272534f8\" (UID: \"15ee17b2-31d9-499e-aea4-713c272534f8\") " Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.344970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ee17b2-31d9-499e-aea4-713c272534f8-kube-api-access-g6267" (OuterVolumeSpecName: "kube-api-access-g6267") pod "15ee17b2-31d9-499e-aea4-713c272534f8" (UID: "15ee17b2-31d9-499e-aea4-713c272534f8"). InnerVolumeSpecName "kube-api-access-g6267". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.366094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15ee17b2-31d9-499e-aea4-713c272534f8" (UID: "15ee17b2-31d9-499e-aea4-713c272534f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.380554 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-config-data" (OuterVolumeSpecName: "config-data") pod "15ee17b2-31d9-499e-aea4-713c272534f8" (UID: "15ee17b2-31d9-499e-aea4-713c272534f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.440897 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6267\" (UniqueName: \"kubernetes.io/projected/15ee17b2-31d9-499e-aea4-713c272534f8-kube-api-access-g6267\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.440933 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.440948 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15ee17b2-31d9-499e-aea4-713c272534f8-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.717845 4764 generic.go:334] "Generic (PLEG): container finished" podID="15ee17b2-31d9-499e-aea4-713c272534f8" containerID="fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a" exitCode=0 Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.718198 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"15ee17b2-31d9-499e-aea4-713c272534f8","Type":"ContainerDied","Data":"fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a"} Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.718233 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"15ee17b2-31d9-499e-aea4-713c272534f8","Type":"ContainerDied","Data":"052d738f70a481b8918cc50b263823970dad605d73783e92e656fcf3971b53f0"} Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.718254 4764 scope.go:117] "RemoveContainer" containerID="fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.718394 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.723315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d816b6df-2de6-4e61-9612-613ec427bd48","Type":"ContainerStarted","Data":"2cdcba2f0f2e9c7a3337c4fad750415137f52932ecf3c872ac7559632efca273"} Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.723359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d816b6df-2de6-4e61-9612-613ec427bd48","Type":"ContainerStarted","Data":"0caf74d47de54bd21ddc760d4cd27d0fea75cb992adf9dd4498e61f8b10eb63f"} Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.764476 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.764450052 podStartE2EDuration="1.764450052s" podCreationTimestamp="2025-12-04 01:16:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:15.750356845 +0000 UTC m=+5711.511681256" watchObservedRunningTime="2025-12-04 01:16:15.764450052 +0000 UTC m=+5711.525774473" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.770959 4764 scope.go:117] "RemoveContainer" containerID="fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a" Dec 04 01:16:15 crc kubenswrapper[4764]: E1204 01:16:15.771674 4764 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a\": container with ID starting with fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a not found: ID does not exist" containerID="fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.771701 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a"} err="failed to get container status \"fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a\": rpc error: code = NotFound desc = could not find container \"fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a\": container with ID starting with fb513d1f749ab8ee013db98d196ad4ec50f2a9d988132050514cc426c814d27a not found: ID does not exist" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.790269 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.798788 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.816219 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:16:15 crc kubenswrapper[4764]: E1204 01:16:15.816664 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ee17b2-31d9-499e-aea4-713c272534f8" containerName="nova-cell1-conductor-conductor" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.816690 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ee17b2-31d9-499e-aea4-713c272534f8" containerName="nova-cell1-conductor-conductor" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.816943 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="15ee17b2-31d9-499e-aea4-713c272534f8" containerName="nova-cell1-conductor-conductor" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.817751 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.820667 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.826331 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.922625 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": read tcp 10.217.0.2:42286->10.217.1.73:8775: read: connection reset by peer" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.922671 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.73:8775/\": read tcp 10.217.0.2:42284->10.217.1.73:8775: read: connection reset by peer" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.929862 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.72:8774/\": read tcp 10.217.0.2:36536->10.217.1.72:8774: read: connection reset by peer" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.929895 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.1.72:8774/\": read tcp 10.217.0.2:36540->10.217.1.72:8774: read: connection reset by peer" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.948968 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.949224 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxjg\" (UniqueName: \"kubernetes.io/projected/85b4c080-751d-41bd-8c06-3982fe210fc5-kube-api-access-pgxjg\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:15 crc kubenswrapper[4764]: I1204 01:16:15.949517 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.056952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxjg\" (UniqueName: \"kubernetes.io/projected/85b4c080-751d-41bd-8c06-3982fe210fc5-kube-api-access-pgxjg\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.057059 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.057173 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.067265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.079453 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.082912 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxjg\" (UniqueName: \"kubernetes.io/projected/85b4c080-751d-41bd-8c06-3982fe210fc5-kube-api-access-pgxjg\") pod \"nova-cell1-conductor-0\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.138598 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.376497 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.537257 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.570424 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qxfc\" (UniqueName: \"kubernetes.io/projected/74566398-f898-4c72-bfbf-b77bdf395b33-kube-api-access-9qxfc\") pod \"74566398-f898-4c72-bfbf-b77bdf395b33\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.570505 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74566398-f898-4c72-bfbf-b77bdf395b33-logs\") pod \"74566398-f898-4c72-bfbf-b77bdf395b33\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.570528 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-combined-ca-bundle\") pod \"74566398-f898-4c72-bfbf-b77bdf395b33\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.570555 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ee17b2-31d9-499e-aea4-713c272534f8" path="/var/lib/kubelet/pods/15ee17b2-31d9-499e-aea4-713c272534f8/volumes" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.570586 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-config-data\") pod \"74566398-f898-4c72-bfbf-b77bdf395b33\" (UID: \"74566398-f898-4c72-bfbf-b77bdf395b33\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.571467 4764 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74566398-f898-4c72-bfbf-b77bdf395b33-logs" (OuterVolumeSpecName: "logs") pod "74566398-f898-4c72-bfbf-b77bdf395b33" (UID: "74566398-f898-4c72-bfbf-b77bdf395b33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.590196 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74566398-f898-4c72-bfbf-b77bdf395b33-kube-api-access-9qxfc" (OuterVolumeSpecName: "kube-api-access-9qxfc") pod "74566398-f898-4c72-bfbf-b77bdf395b33" (UID: "74566398-f898-4c72-bfbf-b77bdf395b33"). InnerVolumeSpecName "kube-api-access-9qxfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.597860 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74566398-f898-4c72-bfbf-b77bdf395b33" (UID: "74566398-f898-4c72-bfbf-b77bdf395b33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.599091 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-config-data" (OuterVolumeSpecName: "config-data") pod "74566398-f898-4c72-bfbf-b77bdf395b33" (UID: "74566398-f898-4c72-bfbf-b77bdf395b33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.658294 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.672506 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-combined-ca-bundle\") pod \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.672567 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc5sk\" (UniqueName: \"kubernetes.io/projected/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-kube-api-access-vc5sk\") pod \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.672724 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-logs\") pod \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.672763 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-config-data\") pod \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\" (UID: \"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.673166 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qxfc\" (UniqueName: \"kubernetes.io/projected/74566398-f898-4c72-bfbf-b77bdf395b33-kube-api-access-9qxfc\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.673183 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74566398-f898-4c72-bfbf-b77bdf395b33-logs\") on node \"crc\" DevicePath \"\"" 
Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.673194 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.673203 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74566398-f898-4c72-bfbf-b77bdf395b33-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.673878 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-logs" (OuterVolumeSpecName: "logs") pod "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" (UID: "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.690292 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-kube-api-access-vc5sk" (OuterVolumeSpecName: "kube-api-access-vc5sk") pod "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" (UID: "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9"). InnerVolumeSpecName "kube-api-access-vc5sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.705651 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" (UID: "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.706115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-config-data" (OuterVolumeSpecName: "config-data") pod "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" (UID: "c0f76ede-6459-4fb5-96e6-e4ce0870a2f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.738626 4764 generic.go:334] "Generic (PLEG): container finished" podID="73bd46f9-c2db-40fb-a8bd-bb922f14fef4" containerID="3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f" exitCode=0 Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.738643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73bd46f9-c2db-40fb-a8bd-bb922f14fef4","Type":"ContainerDied","Data":"3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f"} Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.738690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"73bd46f9-c2db-40fb-a8bd-bb922f14fef4","Type":"ContainerDied","Data":"8b15e507be059cc55182b5ecccdf66e79f0c1aebecae3d40b9c1cbf4667a5586"} Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.738594 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.738708 4764 scope.go:117] "RemoveContainer" containerID="3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.741351 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.744015 4764 generic.go:334] "Generic (PLEG): container finished" podID="74566398-f898-4c72-bfbf-b77bdf395b33" containerID="fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a" exitCode=0 Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.744101 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.744118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74566398-f898-4c72-bfbf-b77bdf395b33","Type":"ContainerDied","Data":"fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a"} Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.744189 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"74566398-f898-4c72-bfbf-b77bdf395b33","Type":"ContainerDied","Data":"b1b03297656ebbadfc99d6fdee5854ab869559322e4e2482bfb1c3d63b57cb3a"} Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.749081 4764 generic.go:334] "Generic (PLEG): container finished" podID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerID="f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081" exitCode=0 Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.749134 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9","Type":"ContainerDied","Data":"f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081"} Dec 04 01:16:16 
crc kubenswrapper[4764]: I1204 01:16:16.749157 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c0f76ede-6459-4fb5-96e6-e4ce0870a2f9","Type":"ContainerDied","Data":"7593f2bdb80ef2aad2532af0035cd35f81116670ce4ed57da362a94f88d83bc9"} Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.749244 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.775817 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sm57\" (UniqueName: \"kubernetes.io/projected/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-kube-api-access-2sm57\") pod \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.775964 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-config-data\") pod \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.776009 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-combined-ca-bundle\") pod \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\" (UID: \"73bd46f9-c2db-40fb-a8bd-bb922f14fef4\") " Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.776342 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.776353 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-config-data\") on 
node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.776361 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.776566 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc5sk\" (UniqueName: \"kubernetes.io/projected/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9-kube-api-access-vc5sk\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.791026 4764 scope.go:117] "RemoveContainer" containerID="3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.798275 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f\": container with ID starting with 3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f not found: ID does not exist" containerID="3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.798560 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f"} err="failed to get container status \"3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f\": rpc error: code = NotFound desc = could not find container \"3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f\": container with ID starting with 3cff6706d4ab761eadab2d9b78e0e9c1251cbc4922f7b183b22c6848cc1e036f not found: ID does not exist" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.798595 4764 scope.go:117] "RemoveContainer" 
containerID="fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.817540 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-kube-api-access-2sm57" (OuterVolumeSpecName: "kube-api-access-2sm57") pod "73bd46f9-c2db-40fb-a8bd-bb922f14fef4" (UID: "73bd46f9-c2db-40fb-a8bd-bb922f14fef4"). InnerVolumeSpecName "kube-api-access-2sm57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.857277 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-config-data" (OuterVolumeSpecName: "config-data") pod "73bd46f9-c2db-40fb-a8bd-bb922f14fef4" (UID: "73bd46f9-c2db-40fb-a8bd-bb922f14fef4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.857399 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73bd46f9-c2db-40fb-a8bd-bb922f14fef4" (UID: "73bd46f9-c2db-40fb-a8bd-bb922f14fef4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.864553 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.875023 4764 scope.go:117] "RemoveContainer" containerID="4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.876475 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.883045 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sm57\" (UniqueName: \"kubernetes.io/projected/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-kube-api-access-2sm57\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.883073 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.883082 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73bd46f9-c2db-40fb-a8bd-bb922f14fef4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.890833 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.891323 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-metadata" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891341 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-metadata" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.891382 4764 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-api" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891391 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-api" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.891401 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-log" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891409 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-log" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.891426 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-log" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891432 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-log" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.891448 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73bd46f9-c2db-40fb-a8bd-bb922f14fef4" containerName="nova-cell0-conductor-conductor" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891454 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bd46f9-c2db-40fb-a8bd-bb922f14fef4" containerName="nova-cell0-conductor-conductor" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891637 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-log" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891652 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" containerName="nova-metadata-metadata" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891673 
4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-log" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891689 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" containerName="nova-api-api" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.891695 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="73bd46f9-c2db-40fb-a8bd-bb922f14fef4" containerName="nova-cell0-conductor-conductor" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.893656 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.896860 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.905460 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.915962 4764 scope.go:117] "RemoveContainer" containerID="fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.916669 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a\": container with ID starting with fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a not found: ID does not exist" containerID="fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.916716 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a"} err="failed to get container status 
\"fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a\": rpc error: code = NotFound desc = could not find container \"fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a\": container with ID starting with fd7fa57358042e260a418c46a2d60e6987c9f404b69b56f5b5bfd22b0ce1ca4a not found: ID does not exist" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.916794 4764 scope.go:117] "RemoveContainer" containerID="4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.917741 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2\": container with ID starting with 4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2 not found: ID does not exist" containerID="4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.917786 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2"} err="failed to get container status \"4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2\": rpc error: code = NotFound desc = could not find container \"4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2\": container with ID starting with 4c80423bd0ba691d48a14ecac55347f6b364f95faa37960b93285e2bdfe535b2 not found: ID does not exist" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.917818 4764 scope.go:117] "RemoveContainer" containerID="f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.922196 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.937660 4764 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.945506 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.946125 4764 scope.go:117] "RemoveContainer" containerID="cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.947249 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.950491 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.957706 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.974137 4764 scope.go:117] "RemoveContainer" containerID="f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.974492 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081\": container with ID starting with f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081 not found: ID does not exist" containerID="f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.974516 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081"} err="failed to get container status \"f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081\": rpc error: code = NotFound desc = could not find container \"f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081\": container with ID starting with 
f9821e73920be8a2583b274a9c1ee085a8489c3ff7c2d74d8c8a5e10dfb32081 not found: ID does not exist" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.974536 4764 scope.go:117] "RemoveContainer" containerID="cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974" Dec 04 01:16:16 crc kubenswrapper[4764]: E1204 01:16:16.974832 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974\": container with ID starting with cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974 not found: ID does not exist" containerID="cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.974913 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974"} err="failed to get container status \"cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974\": rpc error: code = NotFound desc = could not find container \"cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974\": container with ID starting with cb85a9d0ebef97a5179d917464b1a3ac3a38cd6cedc4c080aa6bddc8d1901974 not found: ID does not exist" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989104 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a62a7a6-2d10-4577-9c35-dc3ecc032237-logs\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989185 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab096a5c-73c5-409e-a765-00ccd00123d4-logs\") pod \"nova-api-0\" (UID: 
\"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-config-data\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989285 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dt2d\" (UniqueName: \"kubernetes.io/projected/9a62a7a6-2d10-4577-9c35-dc3ecc032237-kube-api-access-6dt2d\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989310 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-config-data\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:16 crc kubenswrapper[4764]: I1204 01:16:16.989348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:16 crc 
kubenswrapper[4764]: I1204 01:16:16.989368 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mllnf\" (UniqueName: \"kubernetes.io/projected/ab096a5c-73c5-409e-a765-00ccd00123d4-kube-api-access-mllnf\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.075440 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.090039 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091691 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a62a7a6-2d10-4577-9c35-dc3ecc032237-logs\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091790 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab096a5c-73c5-409e-a765-00ccd00123d4-logs\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091826 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-config-data\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091873 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091919 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dt2d\" (UniqueName: \"kubernetes.io/projected/9a62a7a6-2d10-4577-9c35-dc3ecc032237-kube-api-access-6dt2d\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091950 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-config-data\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.091998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.092026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mllnf\" (UniqueName: \"kubernetes.io/projected/ab096a5c-73c5-409e-a765-00ccd00123d4-kube-api-access-mllnf\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.092188 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a62a7a6-2d10-4577-9c35-dc3ecc032237-logs\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.092362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab096a5c-73c5-409e-a765-00ccd00123d4-logs\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.101375 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.103048 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.105506 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.107031 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-config-data\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.109058 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.112894 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.134365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-config-data\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.136475 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.144489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dt2d\" (UniqueName: \"kubernetes.io/projected/9a62a7a6-2d10-4577-9c35-dc3ecc032237-kube-api-access-6dt2d\") pod \"nova-metadata-0\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.146154 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mllnf\" (UniqueName: \"kubernetes.io/projected/ab096a5c-73c5-409e-a765-00ccd00123d4-kube-api-access-mllnf\") pod \"nova-api-0\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.193681 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.193814 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b4bm\" (UniqueName: \"kubernetes.io/projected/7df9bb94-13e8-4250-8538-f55f68e1d29a-kube-api-access-8b4bm\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.193907 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: E1204 01:16:17.195292 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 01:16:17 crc kubenswrapper[4764]: E1204 01:16:17.196739 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 01:16:17 crc kubenswrapper[4764]: E1204 01:16:17.198179 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 01:16:17 crc kubenswrapper[4764]: E1204 01:16:17.198245 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" containerName="nova-scheduler-scheduler" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.219059 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.291863 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.295683 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.295778 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b4bm\" (UniqueName: \"kubernetes.io/projected/7df9bb94-13e8-4250-8538-f55f68e1d29a-kube-api-access-8b4bm\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.295824 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.299840 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.301966 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.328808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b4bm\" (UniqueName: \"kubernetes.io/projected/7df9bb94-13e8-4250-8538-f55f68e1d29a-kube-api-access-8b4bm\") pod \"nova-cell0-conductor-0\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.469738 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: W1204 01:16:17.737247 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a62a7a6_2d10_4577_9c35_dc3ecc032237.slice/crio-1cb1573f63a32bc60fc2d422c8ab33526ccb30c4a9b13840d6665a0169d7da17 WatchSource:0}: Error finding container 1cb1573f63a32bc60fc2d422c8ab33526ccb30c4a9b13840d6665a0169d7da17: Status 404 returned error can't find the container with id 1cb1573f63a32bc60fc2d422c8ab33526ccb30c4a9b13840d6665a0169d7da17 Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.758837 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.767957 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a62a7a6-2d10-4577-9c35-dc3ecc032237","Type":"ContainerStarted","Data":"1cb1573f63a32bc60fc2d422c8ab33526ccb30c4a9b13840d6665a0169d7da17"} Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.770457 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85b4c080-751d-41bd-8c06-3982fe210fc5","Type":"ContainerStarted","Data":"d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349"} Dec 04 01:16:17 crc 
kubenswrapper[4764]: I1204 01:16:17.770488 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85b4c080-751d-41bd-8c06-3982fe210fc5","Type":"ContainerStarted","Data":"545e220b98e947dc1a29d784dfe57a7b04f5680f4e3772195816af9761ecc2dd"} Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.772303 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.795342 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.795322824 podStartE2EDuration="2.795322824s" podCreationTimestamp="2025-12-04 01:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:17.79313526 +0000 UTC m=+5713.554459671" watchObservedRunningTime="2025-12-04 01:16:17.795322824 +0000 UTC m=+5713.556647235" Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.839660 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: I1204 01:16:17.935553 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 01:16:17 crc kubenswrapper[4764]: W1204 01:16:17.938138 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7df9bb94_13e8_4250_8538_f55f68e1d29a.slice/crio-a7ca49fde1d08ea109aef0d1a932d4a85e77a8c83440118df0ce0c7944d2d09c WatchSource:0}: Error finding container a7ca49fde1d08ea109aef0d1a932d4a85e77a8c83440118df0ce0c7944d2d09c: Status 404 returned error can't find the container with id a7ca49fde1d08ea109aef0d1a932d4a85e77a8c83440118df0ce0c7944d2d09c Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.561234 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="73bd46f9-c2db-40fb-a8bd-bb922f14fef4" path="/var/lib/kubelet/pods/73bd46f9-c2db-40fb-a8bd-bb922f14fef4/volumes" Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.562661 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74566398-f898-4c72-bfbf-b77bdf395b33" path="/var/lib/kubelet/pods/74566398-f898-4c72-bfbf-b77bdf395b33/volumes" Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.563547 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f76ede-6459-4fb5-96e6-e4ce0870a2f9" path="/var/lib/kubelet/pods/c0f76ede-6459-4fb5-96e6-e4ce0870a2f9/volumes" Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.780437 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab096a5c-73c5-409e-a765-00ccd00123d4","Type":"ContainerStarted","Data":"6c943e4231f493c8dc35fdfb95b9753bb224e0fc35cc27d4ee80ba8bf459da59"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.780485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab096a5c-73c5-409e-a765-00ccd00123d4","Type":"ContainerStarted","Data":"1363f1b0274e1dc39fa5a3395613029631387b7821bc5c5ae39e647dd1436ff4"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.780495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab096a5c-73c5-409e-a765-00ccd00123d4","Type":"ContainerStarted","Data":"a53a7cc2b7bf2d7ebc8df087ea3b8dab5f4725270c579e624d48ec0efe96afce"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.791425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7df9bb94-13e8-4250-8538-f55f68e1d29a","Type":"ContainerStarted","Data":"aa1748e5f32efac2776cd6a627bca59715d373132528426aee5f82631a52ed7e"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.791463 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"7df9bb94-13e8-4250-8538-f55f68e1d29a","Type":"ContainerStarted","Data":"a7ca49fde1d08ea109aef0d1a932d4a85e77a8c83440118df0ce0c7944d2d09c"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.791570 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.793118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a62a7a6-2d10-4577-9c35-dc3ecc032237","Type":"ContainerStarted","Data":"151e508b584402b282c414977cdbc9ef76da2ee3166f08c5ee1d0437cda46176"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.793167 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a62a7a6-2d10-4577-9c35-dc3ecc032237","Type":"ContainerStarted","Data":"ca95ba055a32a0d428208d646911fb0196920bf9994bf72f74317d6dbe7568e6"} Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.825206 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8251812259999998 podStartE2EDuration="2.825181226s" podCreationTimestamp="2025-12-04 01:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:18.808824583 +0000 UTC m=+5714.570149004" watchObservedRunningTime="2025-12-04 01:16:18.825181226 +0000 UTC m=+5714.586505667" Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.841147 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.841115108 podStartE2EDuration="1.841115108s" podCreationTimestamp="2025-12-04 01:16:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:18.833157052 +0000 UTC m=+5714.594481463" 
watchObservedRunningTime="2025-12-04 01:16:18.841115108 +0000 UTC m=+5714.602439559" Dec 04 01:16:18 crc kubenswrapper[4764]: I1204 01:16:18.867338 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.867317583 podStartE2EDuration="2.867317583s" podCreationTimestamp="2025-12-04 01:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:18.857887181 +0000 UTC m=+5714.619211642" watchObservedRunningTime="2025-12-04 01:16:18.867317583 +0000 UTC m=+5714.628642004" Dec 04 01:16:19 crc kubenswrapper[4764]: I1204 01:16:19.427521 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.340150 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.488385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-combined-ca-bundle\") pod \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.488520 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqmhf\" (UniqueName: \"kubernetes.io/projected/8bdb5ccb-650f-4264-810f-0a3ad037f59e-kube-api-access-kqmhf\") pod \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.488651 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-config-data\") pod 
\"8bdb5ccb-650f-4264-810f-0a3ad037f59e\" (UID: \"8bdb5ccb-650f-4264-810f-0a3ad037f59e\") " Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.493657 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdb5ccb-650f-4264-810f-0a3ad037f59e-kube-api-access-kqmhf" (OuterVolumeSpecName: "kube-api-access-kqmhf") pod "8bdb5ccb-650f-4264-810f-0a3ad037f59e" (UID: "8bdb5ccb-650f-4264-810f-0a3ad037f59e"). InnerVolumeSpecName "kube-api-access-kqmhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.525882 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-config-data" (OuterVolumeSpecName: "config-data") pod "8bdb5ccb-650f-4264-810f-0a3ad037f59e" (UID: "8bdb5ccb-650f-4264-810f-0a3ad037f59e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.528543 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bdb5ccb-650f-4264-810f-0a3ad037f59e" (UID: "8bdb5ccb-650f-4264-810f-0a3ad037f59e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.590635 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqmhf\" (UniqueName: \"kubernetes.io/projected/8bdb5ccb-650f-4264-810f-0a3ad037f59e-kube-api-access-kqmhf\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.590664 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.590673 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bdb5ccb-650f-4264-810f-0a3ad037f59e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.827486 4764 generic.go:334] "Generic (PLEG): container finished" podID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" exitCode=0 Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.827544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bdb5ccb-650f-4264-810f-0a3ad037f59e","Type":"ContainerDied","Data":"92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c"} Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.827590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8bdb5ccb-650f-4264-810f-0a3ad037f59e","Type":"ContainerDied","Data":"4a05aa02c6f8df427720a5c8daf71fff7a5f5585f421f94ca737a1d367fac159"} Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.827616 4764 scope.go:117] "RemoveContainer" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.828165 4764 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.854826 4764 scope.go:117] "RemoveContainer" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" Dec 04 01:16:21 crc kubenswrapper[4764]: E1204 01:16:21.855400 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c\": container with ID starting with 92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c not found: ID does not exist" containerID="92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.855443 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c"} err="failed to get container status \"92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c\": rpc error: code = NotFound desc = could not find container \"92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c\": container with ID starting with 92c767cb469f35817ba28600642f4aeb5541141f335ab0a685973c400546672c not found: ID does not exist" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.874371 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.891844 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.906336 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:16:21 crc kubenswrapper[4764]: E1204 01:16:21.906962 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" containerName="nova-scheduler-scheduler" Dec 04 01:16:21 
crc kubenswrapper[4764]: I1204 01:16:21.906989 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" containerName="nova-scheduler-scheduler" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.907299 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" containerName="nova-scheduler-scheduler" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.908310 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.915206 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.916296 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.997943 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr6kv\" (UniqueName: \"kubernetes.io/projected/060031f0-ca89-4cb4-82db-2b095eb44cf0-kube-api-access-zr6kv\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.998126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-config-data\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:21 crc kubenswrapper[4764]: I1204 01:16:21.998172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.100347 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr6kv\" (UniqueName: \"kubernetes.io/projected/060031f0-ca89-4cb4-82db-2b095eb44cf0-kube-api-access-zr6kv\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.100478 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-config-data\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.100511 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.104760 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-config-data\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.111378 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.121514 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zr6kv\" (UniqueName: \"kubernetes.io/projected/060031f0-ca89-4cb4-82db-2b095eb44cf0-kube-api-access-zr6kv\") pod \"nova-scheduler-0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.219926 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.220210 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.237944 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.556558 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdb5ccb-650f-4264-810f-0a3ad037f59e" path="/var/lib/kubelet/pods/8bdb5ccb-650f-4264-810f-0a3ad037f59e/volumes" Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.661298 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 01:16:22 crc kubenswrapper[4764]: W1204 01:16:22.662315 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod060031f0_ca89_4cb4_82db_2b095eb44cf0.slice/crio-06baab70f8c4d8028d90a04d424c769470d748dc9bf80acf444cb4a7b635fd8b WatchSource:0}: Error finding container 06baab70f8c4d8028d90a04d424c769470d748dc9bf80acf444cb4a7b635fd8b: Status 404 returned error can't find the container with id 06baab70f8c4d8028d90a04d424c769470d748dc9bf80acf444cb4a7b635fd8b Dec 04 01:16:22 crc kubenswrapper[4764]: I1204 01:16:22.845265 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"060031f0-ca89-4cb4-82db-2b095eb44cf0","Type":"ContainerStarted","Data":"06baab70f8c4d8028d90a04d424c769470d748dc9bf80acf444cb4a7b635fd8b"} Dec 04 01:16:23 crc kubenswrapper[4764]: I1204 01:16:23.858444 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"060031f0-ca89-4cb4-82db-2b095eb44cf0","Type":"ContainerStarted","Data":"ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7"} Dec 04 01:16:23 crc kubenswrapper[4764]: I1204 01:16:23.877466 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.877449024 podStartE2EDuration="2.877449024s" podCreationTimestamp="2025-12-04 01:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:23.872212905 +0000 UTC m=+5719.633537316" watchObservedRunningTime="2025-12-04 01:16:23.877449024 +0000 UTC m=+5719.638773435" Dec 04 01:16:24 crc kubenswrapper[4764]: I1204 01:16:24.427222 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:24 crc kubenswrapper[4764]: I1204 01:16:24.444411 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:24 crc kubenswrapper[4764]: I1204 01:16:24.876245 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 01:16:26 crc kubenswrapper[4764]: I1204 01:16:26.173627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 01:16:27 crc kubenswrapper[4764]: I1204 01:16:27.219971 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 01:16:27 crc kubenswrapper[4764]: I1204 01:16:27.220337 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 01:16:27 crc kubenswrapper[4764]: I1204 01:16:27.239141 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 01:16:27 crc kubenswrapper[4764]: I1204 01:16:27.292996 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 01:16:27 crc kubenswrapper[4764]: I1204 01:16:27.293056 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 01:16:27 crc kubenswrapper[4764]: I1204 01:16:27.509492 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 01:16:28 crc kubenswrapper[4764]: I1204 01:16:28.301951 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:16:28 crc kubenswrapper[4764]: I1204 01:16:28.302642 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:16:28 crc kubenswrapper[4764]: I1204 01:16:28.387070 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:16:28 crc kubenswrapper[4764]: I1204 01:16:28.387080 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.84:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.631483 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.633602 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.635464 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.660846 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.701512 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88af1972-91f6-4955-a7ab-b3cbe3f56109-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.701924 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.701970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82mc5\" (UniqueName: \"kubernetes.io/projected/88af1972-91f6-4955-a7ab-b3cbe3f56109-kube-api-access-82mc5\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") 
" pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.702105 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.702125 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-scripts\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.702172 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.803850 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.804254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-scripts\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.804456 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.804632 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88af1972-91f6-4955-a7ab-b3cbe3f56109-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.804756 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88af1972-91f6-4955-a7ab-b3cbe3f56109-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.805192 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.805389 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82mc5\" (UniqueName: \"kubernetes.io/projected/88af1972-91f6-4955-a7ab-b3cbe3f56109-kube-api-access-82mc5\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.810187 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-scripts\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.810500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.810921 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.812078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.823213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82mc5\" (UniqueName: \"kubernetes.io/projected/88af1972-91f6-4955-a7ab-b3cbe3f56109-kube-api-access-82mc5\") pod \"cinder-scheduler-0\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:31 crc kubenswrapper[4764]: I1204 01:16:31.954051 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.239217 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.272317 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.465499 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:32 crc kubenswrapper[4764]: W1204 01:16:32.475803 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88af1972_91f6_4955_a7ab_b3cbe3f56109.slice/crio-b73ba640bbc9f6b15b5098ade1a2cfde26ffcc51855e07e7310119e72c30bcb4 WatchSource:0}: Error finding container b73ba640bbc9f6b15b5098ade1a2cfde26ffcc51855e07e7310119e72c30bcb4: Status 404 returned error can't find the container with id b73ba640bbc9f6b15b5098ade1a2cfde26ffcc51855e07e7310119e72c30bcb4 Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.947820 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88af1972-91f6-4955-a7ab-b3cbe3f56109","Type":"ContainerStarted","Data":"b73ba640bbc9f6b15b5098ade1a2cfde26ffcc51855e07e7310119e72c30bcb4"} Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.961400 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.962030 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api" containerID="cri-o://35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3" gracePeriod=30 Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.961704 4764 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/cinder-api-0" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api-log" containerID="cri-o://fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b" gracePeriod=30 Dec 04 01:16:32 crc kubenswrapper[4764]: I1204 01:16:32.996475 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.585642 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-njg4h"] Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.587879 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.594737 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-njg4h"] Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.737760 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xct8p\" (UniqueName: \"kubernetes.io/projected/267f2443-270c-42c6-8d75-54a65cd8637c-kube-api-access-xct8p\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.737863 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-utilities\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.737933 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-catalog-content\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.830664 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.832295 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.834018 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.839320 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xct8p\" (UniqueName: \"kubernetes.io/projected/267f2443-270c-42c6-8d75-54a65cd8637c-kube-api-access-xct8p\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.839409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-utilities\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.839451 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-catalog-content\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.839898 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-utilities\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.839943 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-catalog-content\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.848747 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.859211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xct8p\" (UniqueName: \"kubernetes.io/projected/267f2443-270c-42c6-8d75-54a65cd8637c-kube-api-access-xct8p\") pod \"redhat-marketplace-njg4h\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941250 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81e8ae86-9710-408e-83e0-8c50d1c53ce1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " 
pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941335 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-run\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941477 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 
04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941632 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941774 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941806 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941841 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdcml\" (UniqueName: \"kubernetes.io/projected/81e8ae86-9710-408e-83e0-8c50d1c53ce1-kube-api-access-mdcml\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941879 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 
01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.941978 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.942087 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.942280 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.966611 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.981055 4764 generic.go:334] "Generic (PLEG): container finished" podID="73ded4db-32b3-401a-9c1c-4751895ce624" containerID="fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b" exitCode=143 Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.981133 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73ded4db-32b3-401a-9c1c-4751895ce624","Type":"ContainerDied","Data":"fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b"} Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.985517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88af1972-91f6-4955-a7ab-b3cbe3f56109","Type":"ContainerStarted","Data":"47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e"} Dec 04 01:16:33 crc kubenswrapper[4764]: I1204 01:16:33.985544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88af1972-91f6-4955-a7ab-b3cbe3f56109","Type":"ContainerStarted","Data":"740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529"} Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.044886 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.044951 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 
01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045147 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdcml\" (UniqueName: \"kubernetes.io/projected/81e8ae86-9710-408e-83e0-8c50d1c53ce1-kube-api-access-mdcml\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045280 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045329 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045409 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045583 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045665 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81e8ae86-9710-408e-83e0-8c50d1c53ce1-ceph\") pod 
\"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045858 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-run\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.046020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.045174 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc 
kubenswrapper[4764]: I1204 01:16:34.050103 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050365 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-run\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050397 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-sys\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050500 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050584 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050892 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.050924 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/81e8ae86-9710-408e-83e0-8c50d1c53ce1-dev\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.052003 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/81e8ae86-9710-408e-83e0-8c50d1c53ce1-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.054262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 
01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.059963 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.064557 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81e8ae86-9710-408e-83e0-8c50d1c53ce1-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.068238 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdcml\" (UniqueName: \"kubernetes.io/projected/81e8ae86-9710-408e-83e0-8c50d1c53ce1-kube-api-access-mdcml\") pod \"cinder-volume-volume1-0\" (UID: \"81e8ae86-9710-408e-83e0-8c50d1c53ce1\") " pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.145998 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.508457 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.508440631 podStartE2EDuration="3.508440631s" podCreationTimestamp="2025-12-04 01:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:34.026120328 +0000 UTC m=+5729.787444739" watchObservedRunningTime="2025-12-04 01:16:34.508440631 +0000 UTC m=+5730.269765042" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.512004 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-njg4h"] Dec 04 01:16:34 crc kubenswrapper[4764]: W1204 01:16:34.516057 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod267f2443_270c_42c6_8d75_54a65cd8637c.slice/crio-71f8bdc6bbca07b3dfc6aef199cd09eeadb7e22271ae76633958745952a943bc WatchSource:0}: Error finding container 71f8bdc6bbca07b3dfc6aef199cd09eeadb7e22271ae76633958745952a943bc: Status 404 returned error can't find the container with id 71f8bdc6bbca07b3dfc6aef199cd09eeadb7e22271ae76633958745952a943bc Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.542690 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.544258 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.551249 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558630 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-config-data\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558680 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558712 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-scripts\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558833 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-lib-modules\") pod 
\"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558896 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-config-data-custom\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558916 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-dev\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558940 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-run\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.558962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 
crc kubenswrapper[4764]: I1204 01:16:34.559020 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.559050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-sys\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.559073 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-nvme\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.559098 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.559124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csrpf\" (UniqueName: \"kubernetes.io/projected/04f3ee5a-109b-49df-9264-c6bb556af4bc-kube-api-access-csrpf\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.559157 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04f3ee5a-109b-49df-9264-c6bb556af4bc-ceph\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.566049 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661203 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-config-data-custom\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-dev\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661273 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-run\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661291 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661341 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-sys\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-nvme\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661416 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csrpf\" (UniqueName: \"kubernetes.io/projected/04f3ee5a-109b-49df-9264-c6bb556af4bc-kube-api-access-csrpf\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04f3ee5a-109b-49df-9264-c6bb556af4bc-ceph\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 
04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661481 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-config-data\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661503 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661542 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-scripts\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-lib-modules\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661729 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661765 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-dev\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.661803 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-run\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.662052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-nvme\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.662117 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.662140 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-sys\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.662171 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.662427 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.663995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-lib-modules\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.664019 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/04f3ee5a-109b-49df-9264-c6bb556af4bc-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.666982 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/04f3ee5a-109b-49df-9264-c6bb556af4bc-ceph\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " 
pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.667545 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-config-data\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.668179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.668548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-scripts\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.678122 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04f3ee5a-109b-49df-9264-c6bb556af4bc-config-data-custom\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.681766 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csrpf\" (UniqueName: \"kubernetes.io/projected/04f3ee5a-109b-49df-9264-c6bb556af4bc-kube-api-access-csrpf\") pod \"cinder-backup-0\" (UID: \"04f3ee5a-109b-49df-9264-c6bb556af4bc\") " pod="openstack/cinder-backup-0" Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.719769 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 04 01:16:34 crc 
kubenswrapper[4764]: W1204 01:16:34.791384 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e8ae86_9710_408e_83e0_8c50d1c53ce1.slice/crio-08c690544bcf4e96b5f13131e686c82d4b666faa04a5de90a243b819ebbf1408 WatchSource:0}: Error finding container 08c690544bcf4e96b5f13131e686c82d4b666faa04a5de90a243b819ebbf1408: Status 404 returned error can't find the container with id 08c690544bcf4e96b5f13131e686c82d4b666faa04a5de90a243b819ebbf1408 Dec 04 01:16:34 crc kubenswrapper[4764]: I1204 01:16:34.864211 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 01:16:35 crc kubenswrapper[4764]: I1204 01:16:35.007120 4764 generic.go:334] "Generic (PLEG): container finished" podID="267f2443-270c-42c6-8d75-54a65cd8637c" containerID="9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3" exitCode=0 Dec 04 01:16:35 crc kubenswrapper[4764]: I1204 01:16:35.007433 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njg4h" event={"ID":"267f2443-270c-42c6-8d75-54a65cd8637c","Type":"ContainerDied","Data":"9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3"} Dec 04 01:16:35 crc kubenswrapper[4764]: I1204 01:16:35.007464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njg4h" event={"ID":"267f2443-270c-42c6-8d75-54a65cd8637c","Type":"ContainerStarted","Data":"71f8bdc6bbca07b3dfc6aef199cd09eeadb7e22271ae76633958745952a943bc"} Dec 04 01:16:35 crc kubenswrapper[4764]: I1204 01:16:35.011563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81e8ae86-9710-408e-83e0-8c50d1c53ce1","Type":"ContainerStarted","Data":"08c690544bcf4e96b5f13131e686c82d4b666faa04a5de90a243b819ebbf1408"} Dec 04 01:16:35 crc kubenswrapper[4764]: I1204 01:16:35.460793 4764 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 01:16:35 crc kubenswrapper[4764]: W1204 01:16:35.519351 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04f3ee5a_109b_49df_9264_c6bb556af4bc.slice/crio-beb653160be57c5fe76a4c783619dc5e947eea3d51c04d6eee6b608f4aa17f9a WatchSource:0}: Error finding container beb653160be57c5fe76a4c783619dc5e947eea3d51c04d6eee6b608f4aa17f9a: Status 404 returned error can't find the container with id beb653160be57c5fe76a4c783619dc5e947eea3d51c04d6eee6b608f4aa17f9a Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.021997 4764 generic.go:334] "Generic (PLEG): container finished" podID="267f2443-270c-42c6-8d75-54a65cd8637c" containerID="69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e" exitCode=0 Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.022095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njg4h" event={"ID":"267f2443-270c-42c6-8d75-54a65cd8637c","Type":"ContainerDied","Data":"69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e"} Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.024582 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"04f3ee5a-109b-49df-9264-c6bb556af4bc","Type":"ContainerStarted","Data":"beb653160be57c5fe76a4c783619dc5e947eea3d51c04d6eee6b608f4aa17f9a"} Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.027681 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81e8ae86-9710-408e-83e0-8c50d1c53ce1","Type":"ContainerStarted","Data":"615416714a1315c62ae894aa73d6ca10ba163ac248aef56829dea96ed9a4d3a9"} Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.202293 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" 
containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.80:8776/healthcheck\": dial tcp 10.217.1.80:8776: connect: connection refused" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.621988 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.804781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data-custom\") pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.804857 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73ded4db-32b3-401a-9c1c-4751895ce624-etc-machine-id\") pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.804881 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data\") pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.804920 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ded4db-32b3-401a-9c1c-4751895ce624-logs\") pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.804948 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgmhz\" (UniqueName: \"kubernetes.io/projected/73ded4db-32b3-401a-9c1c-4751895ce624-kube-api-access-wgmhz\") 
pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.804966 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-scripts\") pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.805058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-combined-ca-bundle\") pod \"73ded4db-32b3-401a-9c1c-4751895ce624\" (UID: \"73ded4db-32b3-401a-9c1c-4751895ce624\") " Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.806183 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ded4db-32b3-401a-9c1c-4751895ce624-logs" (OuterVolumeSpecName: "logs") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.806856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/73ded4db-32b3-401a-9c1c-4751895ce624-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.809181 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ded4db-32b3-401a-9c1c-4751895ce624-kube-api-access-wgmhz" (OuterVolumeSpecName: "kube-api-access-wgmhz") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "kube-api-access-wgmhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.813877 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.814005 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-scripts" (OuterVolumeSpecName: "scripts") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.861504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.901011 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data" (OuterVolumeSpecName: "config-data") pod "73ded4db-32b3-401a-9c1c-4751895ce624" (UID: "73ded4db-32b3-401a-9c1c-4751895ce624"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907516 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907539 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/73ded4db-32b3-401a-9c1c-4751895ce624-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907550 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907559 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/73ded4db-32b3-401a-9c1c-4751895ce624-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907568 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgmhz\" (UniqueName: \"kubernetes.io/projected/73ded4db-32b3-401a-9c1c-4751895ce624-kube-api-access-wgmhz\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907578 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.907586 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73ded4db-32b3-401a-9c1c-4751895ce624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:36 crc kubenswrapper[4764]: I1204 01:16:36.955905 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.038952 4764 generic.go:334] "Generic (PLEG): container finished" podID="73ded4db-32b3-401a-9c1c-4751895ce624" containerID="35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3" exitCode=0 Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.039022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73ded4db-32b3-401a-9c1c-4751895ce624","Type":"ContainerDied","Data":"35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3"} Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.039054 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"73ded4db-32b3-401a-9c1c-4751895ce624","Type":"ContainerDied","Data":"644cb85a54772dae8cbbe4857a8baa5fa821cea3717d51c4ddf7fd88dc6cdb02"} Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.039073 4764 scope.go:117] "RemoveContainer" containerID="35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.039216 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.044750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njg4h" event={"ID":"267f2443-270c-42c6-8d75-54a65cd8637c","Type":"ContainerStarted","Data":"987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9"} Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.051179 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"04f3ee5a-109b-49df-9264-c6bb556af4bc","Type":"ContainerStarted","Data":"4ad32c3080cb673c47642b12a107621d14871b0c5db1326c26d0185ac53bcca3"} Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.051227 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"04f3ee5a-109b-49df-9264-c6bb556af4bc","Type":"ContainerStarted","Data":"f9fc5b4a989f2a648ce7e2d1031e867567febfffddadb1995be18d53d3ae9387"} Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.056522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"81e8ae86-9710-408e-83e0-8c50d1c53ce1","Type":"ContainerStarted","Data":"334e598bbc8bfed83a91cc85e1f0afb63339b82558940d2d418917ec0a68534a"} Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.070038 4764 scope.go:117] "RemoveContainer" containerID="fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.097243 4764 scope.go:117] "RemoveContainer" containerID="35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.099119 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-njg4h" podStartSLOduration=2.648687198 podStartE2EDuration="4.099101423s" podCreationTimestamp="2025-12-04 01:16:33 +0000 UTC" firstStartedPulling="2025-12-04 
01:16:35.009640067 +0000 UTC m=+5730.770964488" lastFinishedPulling="2025-12-04 01:16:36.460054302 +0000 UTC m=+5732.221378713" observedRunningTime="2025-12-04 01:16:37.070607701 +0000 UTC m=+5732.831932122" watchObservedRunningTime="2025-12-04 01:16:37.099101423 +0000 UTC m=+5732.860425834" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.099649 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.2863456859999998 podStartE2EDuration="3.099644866s" podCreationTimestamp="2025-12-04 01:16:34 +0000 UTC" firstStartedPulling="2025-12-04 01:16:35.521776785 +0000 UTC m=+5731.283101196" lastFinishedPulling="2025-12-04 01:16:36.335075965 +0000 UTC m=+5732.096400376" observedRunningTime="2025-12-04 01:16:37.09450459 +0000 UTC m=+5732.855829011" watchObservedRunningTime="2025-12-04 01:16:37.099644866 +0000 UTC m=+5732.860969277" Dec 04 01:16:37 crc kubenswrapper[4764]: E1204 01:16:37.100631 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3\": container with ID starting with 35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3 not found: ID does not exist" containerID="35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.100669 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3"} err="failed to get container status \"35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3\": rpc error: code = NotFound desc = could not find container \"35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3\": container with ID starting with 35fea39abaed4c8ddd30481d28f90decc72c0c5038e756d07fb99eb69cda44e3 not found: ID does not exist" Dec 04 01:16:37 crc 
kubenswrapper[4764]: I1204 01:16:37.100694 4764 scope.go:117] "RemoveContainer" containerID="fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b" Dec 04 01:16:37 crc kubenswrapper[4764]: E1204 01:16:37.101349 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b\": container with ID starting with fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b not found: ID does not exist" containerID="fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.101395 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b"} err="failed to get container status \"fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b\": rpc error: code = NotFound desc = could not find container \"fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b\": container with ID starting with fd1335cadd81b1f7518dc99110e7174127dcd34bbacb7a306a62a4f83976233b not found: ID does not exist" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.121561 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.357446596 podStartE2EDuration="4.118709546s" podCreationTimestamp="2025-12-04 01:16:33 +0000 UTC" firstStartedPulling="2025-12-04 01:16:34.793907907 +0000 UTC m=+5730.555232318" lastFinishedPulling="2025-12-04 01:16:35.555170857 +0000 UTC m=+5731.316495268" observedRunningTime="2025-12-04 01:16:37.115278711 +0000 UTC m=+5732.876603142" watchObservedRunningTime="2025-12-04 01:16:37.118709546 +0000 UTC m=+5732.880033957" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.170473 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 
01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.183882 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.194581 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:37 crc kubenswrapper[4764]: E1204 01:16:37.195146 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api-log" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.195164 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api-log" Dec 04 01:16:37 crc kubenswrapper[4764]: E1204 01:16:37.195181 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.195188 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.195423 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api-log" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.195434 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" containerName="cinder-api" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.196631 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.201939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.202677 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.228165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.229027 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.238934 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.295137 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.295712 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.296061 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.301859 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320315 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/718b4d98-d5da-4f9b-802c-b00afbdd9593-etc-machine-id\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320384 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-config-data-custom\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/718b4d98-d5da-4f9b-802c-b00afbdd9593-logs\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320463 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-scripts\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320515 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-config-data\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.320611 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8m2\" (UniqueName: 
\"kubernetes.io/projected/718b4d98-d5da-4f9b-802c-b00afbdd9593-kube-api-access-xg8m2\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.422664 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-config-data-custom\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.423102 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/718b4d98-d5da-4f9b-802c-b00afbdd9593-logs\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.423471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/718b4d98-d5da-4f9b-802c-b00afbdd9593-logs\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.423896 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-scripts\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.424401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-config-data\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.424451 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xg8m2\" (UniqueName: \"kubernetes.io/projected/718b4d98-d5da-4f9b-802c-b00afbdd9593-kube-api-access-xg8m2\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.424940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.425121 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/718b4d98-d5da-4f9b-802c-b00afbdd9593-etc-machine-id\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.426953 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/718b4d98-d5da-4f9b-802c-b00afbdd9593-etc-machine-id\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.428240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-config-data-custom\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.428646 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-scripts\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") 
" pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.428755 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.430048 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718b4d98-d5da-4f9b-802c-b00afbdd9593-config-data\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.441308 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8m2\" (UniqueName: \"kubernetes.io/projected/718b4d98-d5da-4f9b-802c-b00afbdd9593-kube-api-access-xg8m2\") pod \"cinder-api-0\" (UID: \"718b4d98-d5da-4f9b-802c-b00afbdd9593\") " pod="openstack/cinder-api-0" Dec 04 01:16:37 crc kubenswrapper[4764]: I1204 01:16:37.521844 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 01:16:38 crc kubenswrapper[4764]: I1204 01:16:38.003374 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 01:16:38 crc kubenswrapper[4764]: I1204 01:16:38.069786 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"718b4d98-d5da-4f9b-802c-b00afbdd9593","Type":"ContainerStarted","Data":"6cda6ee02c6563a14353dafda197bb5a6c61f50bfbdd5c2ab447bdff72915d0b"} Dec 04 01:16:38 crc kubenswrapper[4764]: I1204 01:16:38.070971 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 01:16:38 crc kubenswrapper[4764]: I1204 01:16:38.074153 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 01:16:38 crc kubenswrapper[4764]: I1204 01:16:38.079768 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 01:16:38 crc kubenswrapper[4764]: I1204 01:16:38.556804 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ded4db-32b3-401a-9c1c-4751895ce624" path="/var/lib/kubelet/pods/73ded4db-32b3-401a-9c1c-4751895ce624/volumes" Dec 04 01:16:39 crc kubenswrapper[4764]: I1204 01:16:39.094929 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"718b4d98-d5da-4f9b-802c-b00afbdd9593","Type":"ContainerStarted","Data":"2a2ff2d350af7c5ac364317d76190582c67a101cfc3233fcf8cdc8b5ae7eacb9"} Dec 04 01:16:39 crc kubenswrapper[4764]: I1204 01:16:39.146325 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:39 crc kubenswrapper[4764]: I1204 01:16:39.864391 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 01:16:40 crc kubenswrapper[4764]: I1204 01:16:40.107288 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"718b4d98-d5da-4f9b-802c-b00afbdd9593","Type":"ContainerStarted","Data":"3cfb815d2a1d6244a24789d1e6bf8902909155b8f8f9b9468349d6c9c06ff70d"} Dec 04 01:16:40 crc kubenswrapper[4764]: I1204 01:16:40.130707 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.130688529 podStartE2EDuration="3.130688529s" podCreationTimestamp="2025-12-04 01:16:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:40.126043925 +0000 UTC m=+5735.887368346" watchObservedRunningTime="2025-12-04 01:16:40.130688529 +0000 UTC m=+5735.892012940" Dec 04 01:16:41 crc kubenswrapper[4764]: I1204 01:16:41.121593 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 04 01:16:42 crc kubenswrapper[4764]: I1204 01:16:42.176016 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 01:16:42 crc kubenswrapper[4764]: I1204 01:16:42.219740 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:43 crc kubenswrapper[4764]: I1204 01:16:43.142574 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="cinder-scheduler" containerID="cri-o://740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529" gracePeriod=30 Dec 04 01:16:43 crc kubenswrapper[4764]: I1204 01:16:43.143215 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="probe" containerID="cri-o://47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e" gracePeriod=30 Dec 04 01:16:43 crc kubenswrapper[4764]: I1204 01:16:43.967642 
4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:43 crc kubenswrapper[4764]: I1204 01:16:43.968898 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:44 crc kubenswrapper[4764]: I1204 01:16:44.051792 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:44 crc kubenswrapper[4764]: I1204 01:16:44.160357 4764 generic.go:334] "Generic (PLEG): container finished" podID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerID="47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e" exitCode=0 Dec 04 01:16:44 crc kubenswrapper[4764]: I1204 01:16:44.160459 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88af1972-91f6-4955-a7ab-b3cbe3f56109","Type":"ContainerDied","Data":"47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e"} Dec 04 01:16:44 crc kubenswrapper[4764]: I1204 01:16:44.231482 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:44 crc kubenswrapper[4764]: I1204 01:16:44.315874 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-njg4h"] Dec 04 01:16:44 crc kubenswrapper[4764]: I1204 01:16:44.414329 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.093358 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.093458 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.179464 4764 generic.go:334] "Generic (PLEG): container finished" podID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerID="740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529" exitCode=0 Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.179504 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.179546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88af1972-91f6-4955-a7ab-b3cbe3f56109","Type":"ContainerDied","Data":"740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529"} Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.180872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88af1972-91f6-4955-a7ab-b3cbe3f56109","Type":"ContainerDied","Data":"b73ba640bbc9f6b15b5098ade1a2cfde26ffcc51855e07e7310119e72c30bcb4"} Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.180892 4764 scope.go:117] "RemoveContainer" containerID="47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.198560 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-scripts\") pod \"88af1972-91f6-4955-a7ab-b3cbe3f56109\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.198610 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88af1972-91f6-4955-a7ab-b3cbe3f56109-etc-machine-id\") pod \"88af1972-91f6-4955-a7ab-b3cbe3f56109\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " Dec 04 01:16:45 crc 
kubenswrapper[4764]: I1204 01:16:45.198737 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data-custom\") pod \"88af1972-91f6-4955-a7ab-b3cbe3f56109\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.198814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-combined-ca-bundle\") pod \"88af1972-91f6-4955-a7ab-b3cbe3f56109\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.198803 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88af1972-91f6-4955-a7ab-b3cbe3f56109-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "88af1972-91f6-4955-a7ab-b3cbe3f56109" (UID: "88af1972-91f6-4955-a7ab-b3cbe3f56109"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.198846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82mc5\" (UniqueName: \"kubernetes.io/projected/88af1972-91f6-4955-a7ab-b3cbe3f56109-kube-api-access-82mc5\") pod \"88af1972-91f6-4955-a7ab-b3cbe3f56109\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.199348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data\") pod \"88af1972-91f6-4955-a7ab-b3cbe3f56109\" (UID: \"88af1972-91f6-4955-a7ab-b3cbe3f56109\") " Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.199794 4764 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88af1972-91f6-4955-a7ab-b3cbe3f56109-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.205145 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-scripts" (OuterVolumeSpecName: "scripts") pod "88af1972-91f6-4955-a7ab-b3cbe3f56109" (UID: "88af1972-91f6-4955-a7ab-b3cbe3f56109"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.205169 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88af1972-91f6-4955-a7ab-b3cbe3f56109-kube-api-access-82mc5" (OuterVolumeSpecName: "kube-api-access-82mc5") pod "88af1972-91f6-4955-a7ab-b3cbe3f56109" (UID: "88af1972-91f6-4955-a7ab-b3cbe3f56109"). InnerVolumeSpecName "kube-api-access-82mc5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.205898 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88af1972-91f6-4955-a7ab-b3cbe3f56109" (UID: "88af1972-91f6-4955-a7ab-b3cbe3f56109"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.237707 4764 scope.go:117] "RemoveContainer" containerID="740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.265473 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88af1972-91f6-4955-a7ab-b3cbe3f56109" (UID: "88af1972-91f6-4955-a7ab-b3cbe3f56109"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.301277 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.301311 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.301325 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.301339 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82mc5\" (UniqueName: \"kubernetes.io/projected/88af1972-91f6-4955-a7ab-b3cbe3f56109-kube-api-access-82mc5\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.305166 4764 scope.go:117] "RemoveContainer" containerID="47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e" Dec 04 01:16:45 crc kubenswrapper[4764]: E1204 01:16:45.305573 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e\": container with ID starting with 47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e not found: ID does not exist" containerID="47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.305595 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e"} err="failed to get container status \"47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e\": rpc error: code = NotFound desc = could not find container \"47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e\": container with ID starting with 47d83077d75e59672f2cd53655fffab555a60ae387cf22962cc04a743d67239e not found: ID does not exist" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.305618 4764 scope.go:117] "RemoveContainer" containerID="740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529" Dec 04 01:16:45 crc kubenswrapper[4764]: E1204 01:16:45.305889 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529\": container with ID starting with 740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529 not found: ID does not exist" containerID="740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.305910 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529"} err="failed to get container status \"740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529\": rpc error: code = NotFound desc = could not find container \"740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529\": container with ID starting with 740f1c020afb2399ee449b52fd1709eb5d071a12403999ee465fbcd34d923529 not found: ID does not exist" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.324979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data" (OuterVolumeSpecName: "config-data") pod "88af1972-91f6-4955-a7ab-b3cbe3f56109" (UID: 
"88af1972-91f6-4955-a7ab-b3cbe3f56109"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.402979 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88af1972-91f6-4955-a7ab-b3cbe3f56109-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.511012 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.518850 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.539149 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:45 crc kubenswrapper[4764]: E1204 01:16:45.539543 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="probe" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.539559 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="probe" Dec 04 01:16:45 crc kubenswrapper[4764]: E1204 01:16:45.539576 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="cinder-scheduler" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.539583 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="cinder-scheduler" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.539784 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" containerName="cinder-scheduler" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.539806 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" 
containerName="probe" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.540776 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.545152 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.567894 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.709777 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.709899 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-config-data\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.710060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbmsz\" (UniqueName: \"kubernetes.io/projected/fb066f4d-c699-4548-8b39-85cc5a83211c-kube-api-access-vbmsz\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.710241 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.710451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb066f4d-c699-4548-8b39-85cc5a83211c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.710603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-scripts\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.812323 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-scripts\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.812414 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.812452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-config-data\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 
01:16:45.812514 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbmsz\" (UniqueName: \"kubernetes.io/projected/fb066f4d-c699-4548-8b39-85cc5a83211c-kube-api-access-vbmsz\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.812557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.812641 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb066f4d-c699-4548-8b39-85cc5a83211c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.812780 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb066f4d-c699-4548-8b39-85cc5a83211c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.817204 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-scripts\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.817240 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.817359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.823861 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb066f4d-c699-4548-8b39-85cc5a83211c-config-data\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.835888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbmsz\" (UniqueName: \"kubernetes.io/projected/fb066f4d-c699-4548-8b39-85cc5a83211c-kube-api-access-vbmsz\") pod \"cinder-scheduler-0\" (UID: \"fb066f4d-c699-4548-8b39-85cc5a83211c\") " pod="openstack/cinder-scheduler-0" Dec 04 01:16:45 crc kubenswrapper[4764]: I1204 01:16:45.869047 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.201104 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-njg4h" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="registry-server" containerID="cri-o://987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9" gracePeriod=2 Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.390083 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 01:16:46 crc kubenswrapper[4764]: W1204 01:16:46.394011 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb066f4d_c699_4548_8b39_85cc5a83211c.slice/crio-15d49b182f1315e7a434cf3d3829bf881a6a24a6965317a8e2dd2c1d895c6ae9 WatchSource:0}: Error finding container 15d49b182f1315e7a434cf3d3829bf881a6a24a6965317a8e2dd2c1d895c6ae9: Status 404 returned error can't find the container with id 15d49b182f1315e7a434cf3d3829bf881a6a24a6965317a8e2dd2c1d895c6ae9 Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.558854 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88af1972-91f6-4955-a7ab-b3cbe3f56109" path="/var/lib/kubelet/pods/88af1972-91f6-4955-a7ab-b3cbe3f56109/volumes" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.648932 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.731274 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-utilities\") pod \"267f2443-270c-42c6-8d75-54a65cd8637c\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.731404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-catalog-content\") pod \"267f2443-270c-42c6-8d75-54a65cd8637c\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.731433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xct8p\" (UniqueName: \"kubernetes.io/projected/267f2443-270c-42c6-8d75-54a65cd8637c-kube-api-access-xct8p\") pod \"267f2443-270c-42c6-8d75-54a65cd8637c\" (UID: \"267f2443-270c-42c6-8d75-54a65cd8637c\") " Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.733978 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-utilities" (OuterVolumeSpecName: "utilities") pod "267f2443-270c-42c6-8d75-54a65cd8637c" (UID: "267f2443-270c-42c6-8d75-54a65cd8637c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.734825 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267f2443-270c-42c6-8d75-54a65cd8637c-kube-api-access-xct8p" (OuterVolumeSpecName: "kube-api-access-xct8p") pod "267f2443-270c-42c6-8d75-54a65cd8637c" (UID: "267f2443-270c-42c6-8d75-54a65cd8637c"). InnerVolumeSpecName "kube-api-access-xct8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.751312 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "267f2443-270c-42c6-8d75-54a65cd8637c" (UID: "267f2443-270c-42c6-8d75-54a65cd8637c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.833524 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.833843 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/267f2443-270c-42c6-8d75-54a65cd8637c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:46 crc kubenswrapper[4764]: I1204 01:16:46.833862 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xct8p\" (UniqueName: \"kubernetes.io/projected/267f2443-270c-42c6-8d75-54a65cd8637c-kube-api-access-xct8p\") on node \"crc\" DevicePath \"\"" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.215002 4764 generic.go:334] "Generic (PLEG): container finished" podID="267f2443-270c-42c6-8d75-54a65cd8637c" containerID="987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9" exitCode=0 Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.215092 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njg4h" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.215079 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njg4h" event={"ID":"267f2443-270c-42c6-8d75-54a65cd8637c","Type":"ContainerDied","Data":"987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9"} Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.215523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njg4h" event={"ID":"267f2443-270c-42c6-8d75-54a65cd8637c","Type":"ContainerDied","Data":"71f8bdc6bbca07b3dfc6aef199cd09eeadb7e22271ae76633958745952a943bc"} Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.215553 4764 scope.go:117] "RemoveContainer" containerID="987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.217790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fb066f4d-c699-4548-8b39-85cc5a83211c","Type":"ContainerStarted","Data":"cb70ac02154c3cb2ec482654a1d0d49e9a6e63443e00cb1c5febdee10af94773"} Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.217834 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fb066f4d-c699-4548-8b39-85cc5a83211c","Type":"ContainerStarted","Data":"15d49b182f1315e7a434cf3d3829bf881a6a24a6965317a8e2dd2c1d895c6ae9"} Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.239925 4764 scope.go:117] "RemoveContainer" containerID="69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.262589 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-njg4h"] Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.274151 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-njg4h"] Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.283951 4764 scope.go:117] "RemoveContainer" containerID="9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.305105 4764 scope.go:117] "RemoveContainer" containerID="987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9" Dec 04 01:16:47 crc kubenswrapper[4764]: E1204 01:16:47.306038 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9\": container with ID starting with 987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9 not found: ID does not exist" containerID="987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.306090 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9"} err="failed to get container status \"987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9\": rpc error: code = NotFound desc = could not find container \"987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9\": container with ID starting with 987c3b9f046f9b83e9cb321e67e674781bebd4c5ddffa3e86f05fc9f7076e6a9 not found: ID does not exist" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.306126 4764 scope.go:117] "RemoveContainer" containerID="69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e" Dec 04 01:16:47 crc kubenswrapper[4764]: E1204 01:16:47.306699 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e\": container with ID starting with 
69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e not found: ID does not exist" containerID="69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.306827 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e"} err="failed to get container status \"69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e\": rpc error: code = NotFound desc = could not find container \"69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e\": container with ID starting with 69a43e22f6ead2505a396253ca1c71d93283bcda2d7037871f428377e85f2a6e not found: ID does not exist" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.306914 4764 scope.go:117] "RemoveContainer" containerID="9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3" Dec 04 01:16:47 crc kubenswrapper[4764]: E1204 01:16:47.307329 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3\": container with ID starting with 9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3 not found: ID does not exist" containerID="9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3" Dec 04 01:16:47 crc kubenswrapper[4764]: I1204 01:16:47.307358 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3"} err="failed to get container status \"9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3\": rpc error: code = NotFound desc = could not find container \"9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3\": container with ID starting with 9704f76e09214eeed1942c75aff09171fa4bfe515d53b979fa90b236300c20c3 not found: ID does not 
exist" Dec 04 01:16:48 crc kubenswrapper[4764]: I1204 01:16:48.228507 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fb066f4d-c699-4548-8b39-85cc5a83211c","Type":"ContainerStarted","Data":"5a72e64c7c2592f7969a381c7f725a830fddc33fd4910ab9474be5aeb6a58b52"} Dec 04 01:16:48 crc kubenswrapper[4764]: I1204 01:16:48.252197 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.252175701 podStartE2EDuration="3.252175701s" podCreationTimestamp="2025-12-04 01:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:16:48.247538557 +0000 UTC m=+5744.008862988" watchObservedRunningTime="2025-12-04 01:16:48.252175701 +0000 UTC m=+5744.013500122" Dec 04 01:16:48 crc kubenswrapper[4764]: I1204 01:16:48.558803 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" path="/var/lib/kubelet/pods/267f2443-270c-42c6-8d75-54a65cd8637c/volumes" Dec 04 01:16:49 crc kubenswrapper[4764]: I1204 01:16:49.427883 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 01:16:50 crc kubenswrapper[4764]: I1204 01:16:50.868835 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:16:50 crc kubenswrapper[4764]: I1204 01:16:50.869203 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 04 01:16:50 crc kubenswrapper[4764]: I1204 01:16:50.869268 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 01:16:56 crc kubenswrapper[4764]: I1204 01:16:56.160190 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 04 01:17:20 crc kubenswrapper[4764]: I1204 01:17:20.868880 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:17:20 crc kubenswrapper[4764]: I1204 01:17:20.871338 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:17:50 crc kubenswrapper[4764]: I1204 01:17:50.869635 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:17:50 crc kubenswrapper[4764]: I1204 01:17:50.870679 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:17:50 crc kubenswrapper[4764]: I1204 01:17:50.870781 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:17:50 crc kubenswrapper[4764]: I1204 01:17:50.872055 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0afac051ad12f65e3908eb7c494c1d5ea59de94feec2898e7b80d8b1b5968274"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:17:50 crc kubenswrapper[4764]: I1204 01:17:50.872164 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://0afac051ad12f65e3908eb7c494c1d5ea59de94feec2898e7b80d8b1b5968274" gracePeriod=600 Dec 04 01:17:51 crc kubenswrapper[4764]: I1204 01:17:51.963006 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="0afac051ad12f65e3908eb7c494c1d5ea59de94feec2898e7b80d8b1b5968274" exitCode=0 Dec 04 01:17:51 crc kubenswrapper[4764]: I1204 01:17:51.963078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"0afac051ad12f65e3908eb7c494c1d5ea59de94feec2898e7b80d8b1b5968274"} Dec 04 01:17:51 crc kubenswrapper[4764]: I1204 01:17:51.963593 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3"} Dec 04 01:17:51 crc kubenswrapper[4764]: I1204 01:17:51.963620 4764 scope.go:117] "RemoveContainer" 
containerID="b494b084be29d8d3f2c13a88241e0ae54d55f5161b8ca3ac244dabb48134105d" Dec 04 01:18:12 crc kubenswrapper[4764]: I1204 01:18:12.033215 4764 scope.go:117] "RemoveContainer" containerID="d07bcc7f5d09efdcce95c3ac207c5406706f295647eedbe6721c166617cee1e9" Dec 04 01:18:12 crc kubenswrapper[4764]: I1204 01:18:12.058482 4764 scope.go:117] "RemoveContainer" containerID="391526f996b57838bdc9ff852aa537feedc02931bf5a91b341b965e68aca617a" Dec 04 01:18:18 crc kubenswrapper[4764]: E1204 01:18:18.389552 4764 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:51958->38.102.83.13:39483: write tcp 38.102.83.13:51958->38.102.83.13:39483: write: broken pipe Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.778052 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-npkgc"] Dec 04 01:18:20 crc kubenswrapper[4764]: E1204 01:18:20.779375 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="registry-server" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.779398 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="registry-server" Dec 04 01:18:20 crc kubenswrapper[4764]: E1204 01:18:20.779427 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="extract-utilities" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.779439 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="extract-utilities" Dec 04 01:18:20 crc kubenswrapper[4764]: E1204 01:18:20.779464 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="extract-content" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.779477 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="extract-content" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.779893 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="267f2443-270c-42c6-8d75-54a65cd8637c" containerName="registry-server" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.786466 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.809557 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npkgc"] Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.894915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-catalog-content\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.895004 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-utilities\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.895242 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvbd\" (UniqueName: \"kubernetes.io/projected/62faedf4-c67e-40de-b79d-072a9ff896c4-kube-api-access-sqvbd\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.996997 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-catalog-content\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.997304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-utilities\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.997634 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqvbd\" (UniqueName: \"kubernetes.io/projected/62faedf4-c67e-40de-b79d-072a9ff896c4-kube-api-access-sqvbd\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.997810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-catalog-content\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:20 crc kubenswrapper[4764]: I1204 01:18:20.998007 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-utilities\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:21 crc kubenswrapper[4764]: I1204 01:18:21.018042 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sqvbd\" (UniqueName: \"kubernetes.io/projected/62faedf4-c67e-40de-b79d-072a9ff896c4-kube-api-access-sqvbd\") pod \"community-operators-npkgc\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:21 crc kubenswrapper[4764]: I1204 01:18:21.109524 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:21 crc kubenswrapper[4764]: I1204 01:18:21.659937 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-npkgc"] Dec 04 01:18:22 crc kubenswrapper[4764]: I1204 01:18:22.333393 4764 generic.go:334] "Generic (PLEG): container finished" podID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerID="6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8" exitCode=0 Dec 04 01:18:22 crc kubenswrapper[4764]: I1204 01:18:22.333669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npkgc" event={"ID":"62faedf4-c67e-40de-b79d-072a9ff896c4","Type":"ContainerDied","Data":"6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8"} Dec 04 01:18:22 crc kubenswrapper[4764]: I1204 01:18:22.335336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npkgc" event={"ID":"62faedf4-c67e-40de-b79d-072a9ff896c4","Type":"ContainerStarted","Data":"af7e13393605c0d62a5e9e915f3874b6aed2050342916bf450f603a8d3429d46"} Dec 04 01:18:24 crc kubenswrapper[4764]: I1204 01:18:24.359633 4764 generic.go:334] "Generic (PLEG): container finished" podID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerID="d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7" exitCode=0 Dec 04 01:18:24 crc kubenswrapper[4764]: I1204 01:18:24.360029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npkgc" 
event={"ID":"62faedf4-c67e-40de-b79d-072a9ff896c4","Type":"ContainerDied","Data":"d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7"} Dec 04 01:18:25 crc kubenswrapper[4764]: I1204 01:18:25.369108 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npkgc" event={"ID":"62faedf4-c67e-40de-b79d-072a9ff896c4","Type":"ContainerStarted","Data":"3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc"} Dec 04 01:18:31 crc kubenswrapper[4764]: I1204 01:18:31.110289 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:31 crc kubenswrapper[4764]: I1204 01:18:31.110972 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:31 crc kubenswrapper[4764]: I1204 01:18:31.179970 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:31 crc kubenswrapper[4764]: I1204 01:18:31.219849 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-npkgc" podStartSLOduration=8.521203257 podStartE2EDuration="11.219811908s" podCreationTimestamp="2025-12-04 01:18:20 +0000 UTC" firstStartedPulling="2025-12-04 01:18:22.335814343 +0000 UTC m=+5838.097138764" lastFinishedPulling="2025-12-04 01:18:25.034423004 +0000 UTC m=+5840.795747415" observedRunningTime="2025-12-04 01:18:25.39093404 +0000 UTC m=+5841.152258451" watchObservedRunningTime="2025-12-04 01:18:31.219811908 +0000 UTC m=+5846.981136389" Dec 04 01:18:31 crc kubenswrapper[4764]: I1204 01:18:31.488402 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:31 crc kubenswrapper[4764]: I1204 01:18:31.552659 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-npkgc"] Dec 04 01:18:33 crc kubenswrapper[4764]: I1204 01:18:33.055388 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0756-account-create-update-2l687"] Dec 04 01:18:33 crc kubenswrapper[4764]: I1204 01:18:33.063776 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-lblj5"] Dec 04 01:18:33 crc kubenswrapper[4764]: I1204 01:18:33.072802 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0756-account-create-update-2l687"] Dec 04 01:18:33 crc kubenswrapper[4764]: I1204 01:18:33.080884 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-lblj5"] Dec 04 01:18:33 crc kubenswrapper[4764]: I1204 01:18:33.447154 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-npkgc" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="registry-server" containerID="cri-o://3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc" gracePeriod=2 Dec 04 01:18:33 crc kubenswrapper[4764]: I1204 01:18:33.960858 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.065080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-utilities\") pod \"62faedf4-c67e-40de-b79d-072a9ff896c4\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.065363 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvbd\" (UniqueName: \"kubernetes.io/projected/62faedf4-c67e-40de-b79d-072a9ff896c4-kube-api-access-sqvbd\") pod \"62faedf4-c67e-40de-b79d-072a9ff896c4\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.065472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-catalog-content\") pod \"62faedf4-c67e-40de-b79d-072a9ff896c4\" (UID: \"62faedf4-c67e-40de-b79d-072a9ff896c4\") " Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.066124 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-utilities" (OuterVolumeSpecName: "utilities") pod "62faedf4-c67e-40de-b79d-072a9ff896c4" (UID: "62faedf4-c67e-40de-b79d-072a9ff896c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.080979 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62faedf4-c67e-40de-b79d-072a9ff896c4-kube-api-access-sqvbd" (OuterVolumeSpecName: "kube-api-access-sqvbd") pod "62faedf4-c67e-40de-b79d-072a9ff896c4" (UID: "62faedf4-c67e-40de-b79d-072a9ff896c4"). InnerVolumeSpecName "kube-api-access-sqvbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.108606 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62faedf4-c67e-40de-b79d-072a9ff896c4" (UID: "62faedf4-c67e-40de-b79d-072a9ff896c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.168558 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvbd\" (UniqueName: \"kubernetes.io/projected/62faedf4-c67e-40de-b79d-072a9ff896c4-kube-api-access-sqvbd\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.168616 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.168643 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62faedf4-c67e-40de-b79d-072a9ff896c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.456250 4764 generic.go:334] "Generic (PLEG): container finished" podID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerID="3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc" exitCode=0 Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.456305 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-npkgc" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.456305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npkgc" event={"ID":"62faedf4-c67e-40de-b79d-072a9ff896c4","Type":"ContainerDied","Data":"3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc"} Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.456436 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-npkgc" event={"ID":"62faedf4-c67e-40de-b79d-072a9ff896c4","Type":"ContainerDied","Data":"af7e13393605c0d62a5e9e915f3874b6aed2050342916bf450f603a8d3429d46"} Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.456460 4764 scope.go:117] "RemoveContainer" containerID="3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.486015 4764 scope.go:117] "RemoveContainer" containerID="d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.490230 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-npkgc"] Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.529652 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-npkgc"] Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.574976 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0263bc12-675e-4dbf-a401-6acda0d97f1c" path="/var/lib/kubelet/pods/0263bc12-675e-4dbf-a401-6acda0d97f1c/volumes" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.575845 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" path="/var/lib/kubelet/pods/62faedf4-c67e-40de-b79d-072a9ff896c4/volumes" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.576597 4764 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1" path="/var/lib/kubelet/pods/e7bb7ec5-5f97-4a2f-9dca-20f8f38c48a1/volumes" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.584893 4764 scope.go:117] "RemoveContainer" containerID="6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.622920 4764 scope.go:117] "RemoveContainer" containerID="3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc" Dec 04 01:18:34 crc kubenswrapper[4764]: E1204 01:18:34.624945 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc\": container with ID starting with 3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc not found: ID does not exist" containerID="3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.624978 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc"} err="failed to get container status \"3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc\": rpc error: code = NotFound desc = could not find container \"3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc\": container with ID starting with 3d5bfe954b6c8dce645f36e82eb62c937948cb97da25f843cb1b0424245eb8cc not found: ID does not exist" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.624999 4764 scope.go:117] "RemoveContainer" containerID="d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7" Dec 04 01:18:34 crc kubenswrapper[4764]: E1204 01:18:34.625204 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7\": container with ID starting with d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7 not found: ID does not exist" containerID="d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.625234 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7"} err="failed to get container status \"d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7\": rpc error: code = NotFound desc = could not find container \"d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7\": container with ID starting with d33fedef93ed873286279b74da489eb593b6cd0cdf098817f40bf0204b475bf7 not found: ID does not exist" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.625247 4764 scope.go:117] "RemoveContainer" containerID="6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8" Dec 04 01:18:34 crc kubenswrapper[4764]: E1204 01:18:34.625426 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8\": container with ID starting with 6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8 not found: ID does not exist" containerID="6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8" Dec 04 01:18:34 crc kubenswrapper[4764]: I1204 01:18:34.625453 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8"} err="failed to get container status \"6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8\": rpc error: code = NotFound desc = could not find container \"6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8\": container with ID 
starting with 6d3e4b404fcecd9be993806362d4a2722eb8f763e0beed3ca5e23b8f6bdd1bf8 not found: ID does not exist" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.091205 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k6vxt"] Dec 04 01:18:37 crc kubenswrapper[4764]: E1204 01:18:37.092167 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="registry-server" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.092184 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="registry-server" Dec 04 01:18:37 crc kubenswrapper[4764]: E1204 01:18:37.092216 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="extract-content" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.092224 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="extract-content" Dec 04 01:18:37 crc kubenswrapper[4764]: E1204 01:18:37.092240 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="extract-utilities" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.092248 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="extract-utilities" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.092498 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="62faedf4-c67e-40de-b79d-072a9ff896c4" containerName="registry-server" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.098257 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.102195 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lpkjv" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.107752 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.117637 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k6vxt"] Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.144067 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tcll6"] Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.146650 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.192596 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tcll6"] Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238492 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462adcd9-211b-4d5f-9ebc-4289708c9ee9-scripts\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238553 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5vt\" (UniqueName: \"kubernetes.io/projected/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-kube-api-access-zb5vt\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238587 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-scripts\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238640 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-run\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238701 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-log\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-log-ovn\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-run\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.238859 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-82csl\" (UniqueName: \"kubernetes.io/projected/462adcd9-211b-4d5f-9ebc-4289708c9ee9-kube-api-access-82csl\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.239010 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-run-ovn\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.239065 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-etc-ovs\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.239155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-lib\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340734 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-lib\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340813 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/462adcd9-211b-4d5f-9ebc-4289708c9ee9-scripts\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340833 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5vt\" (UniqueName: \"kubernetes.io/projected/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-kube-api-access-zb5vt\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-scripts\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340889 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-run\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340911 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-log\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340928 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-log-ovn\") pod \"ovn-controller-k6vxt\" (UID: 
\"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340952 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-run\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.340987 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82csl\" (UniqueName: \"kubernetes.io/projected/462adcd9-211b-4d5f-9ebc-4289708c9ee9-kube-api-access-82csl\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-run-ovn\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-etc-ovs\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341109 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-lib\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341164 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-etc-ovs\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341186 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-log\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341219 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-log-ovn\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-run\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.341554 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/462adcd9-211b-4d5f-9ebc-4289708c9ee9-var-run-ovn\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.342110 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-var-run\") pod \"ovn-controller-ovs-tcll6\" (UID: 
\"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.343079 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/462adcd9-211b-4d5f-9ebc-4289708c9ee9-scripts\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.343697 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-scripts\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.361549 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82csl\" (UniqueName: \"kubernetes.io/projected/462adcd9-211b-4d5f-9ebc-4289708c9ee9-kube-api-access-82csl\") pod \"ovn-controller-k6vxt\" (UID: \"462adcd9-211b-4d5f-9ebc-4289708c9ee9\") " pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.368947 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5vt\" (UniqueName: \"kubernetes.io/projected/af09cb3a-3fd3-47c6-ba05-a79b0c66efac-kube-api-access-zb5vt\") pod \"ovn-controller-ovs-tcll6\" (UID: \"af09cb3a-3fd3-47c6-ba05-a79b0c66efac\") " pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.450007 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.473207 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:37 crc kubenswrapper[4764]: I1204 01:18:37.939881 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k6vxt"] Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.322994 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tcll6"] Dec 04 01:18:38 crc kubenswrapper[4764]: W1204 01:18:38.325585 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf09cb3a_3fd3_47c6_ba05_a79b0c66efac.slice/crio-4b19872cfce9bed60baadd5fc3d49694fd5bfdf6ed2a733c3dcbb01d5d26c4d6 WatchSource:0}: Error finding container 4b19872cfce9bed60baadd5fc3d49694fd5bfdf6ed2a733c3dcbb01d5d26c4d6: Status 404 returned error can't find the container with id 4b19872cfce9bed60baadd5fc3d49694fd5bfdf6ed2a733c3dcbb01d5d26c4d6 Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.504791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt" event={"ID":"462adcd9-211b-4d5f-9ebc-4289708c9ee9","Type":"ContainerStarted","Data":"d8fb5346f87c0ee2dcbe85d5d240c7071eaca4380504b20ab4cd63cec8dfce5b"} Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.505044 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt" event={"ID":"462adcd9-211b-4d5f-9ebc-4289708c9ee9","Type":"ContainerStarted","Data":"398ecf90c9a9d1317a7c039c9def1a3128dcff2b5090a7551f406f0769b5ec79"} Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.505490 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-k6vxt" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.506673 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tcll6" 
event={"ID":"af09cb3a-3fd3-47c6-ba05-a79b0c66efac","Type":"ContainerStarted","Data":"4b19872cfce9bed60baadd5fc3d49694fd5bfdf6ed2a733c3dcbb01d5d26c4d6"} Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.529381 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-k6vxt" podStartSLOduration=1.529364642 podStartE2EDuration="1.529364642s" podCreationTimestamp="2025-12-04 01:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:18:38.52317963 +0000 UTC m=+5854.284504041" watchObservedRunningTime="2025-12-04 01:18:38.529364642 +0000 UTC m=+5854.290689053" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.684918 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-fn9rg"] Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.686124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.689085 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.701751 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fn9rg"] Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.773331 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3444b973-3337-4806-a444-4749df6c6fe9-ovs-rundir\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.773417 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lckg\" (UniqueName: 
\"kubernetes.io/projected/3444b973-3337-4806-a444-4749df6c6fe9-kube-api-access-5lckg\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.773451 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3444b973-3337-4806-a444-4749df6c6fe9-config\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.773575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3444b973-3337-4806-a444-4749df6c6fe9-ovn-rundir\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.875265 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3444b973-3337-4806-a444-4749df6c6fe9-ovs-rundir\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.875318 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lckg\" (UniqueName: \"kubernetes.io/projected/3444b973-3337-4806-a444-4749df6c6fe9-kube-api-access-5lckg\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.875345 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3444b973-3337-4806-a444-4749df6c6fe9-config\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.875415 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3444b973-3337-4806-a444-4749df6c6fe9-ovn-rundir\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.875861 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3444b973-3337-4806-a444-4749df6c6fe9-ovs-rundir\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.875915 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3444b973-3337-4806-a444-4749df6c6fe9-ovn-rundir\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.876404 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3444b973-3337-4806-a444-4749df6c6fe9-config\") pod \"ovn-controller-metrics-fn9rg\" (UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:38 crc kubenswrapper[4764]: I1204 01:18:38.903612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lckg\" (UniqueName: \"kubernetes.io/projected/3444b973-3337-4806-a444-4749df6c6fe9-kube-api-access-5lckg\") pod \"ovn-controller-metrics-fn9rg\" 
(UID: \"3444b973-3337-4806-a444-4749df6c6fe9\") " pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.001695 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-fn9rg" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.484432 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-fn9rg"] Dec 04 01:18:39 crc kubenswrapper[4764]: W1204 01:18:39.485231 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3444b973_3337_4806_a444_4749df6c6fe9.slice/crio-c680b3cbb31263d70cd7eed61001a3f13c85ae8b4c574d85c3f82041acecee52 WatchSource:0}: Error finding container c680b3cbb31263d70cd7eed61001a3f13c85ae8b4c574d85c3f82041acecee52: Status 404 returned error can't find the container with id c680b3cbb31263d70cd7eed61001a3f13c85ae8b4c574d85c3f82041acecee52 Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.519584 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fn9rg" event={"ID":"3444b973-3337-4806-a444-4749df6c6fe9","Type":"ContainerStarted","Data":"c680b3cbb31263d70cd7eed61001a3f13c85ae8b4c574d85c3f82041acecee52"} Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.527007 4764 generic.go:334] "Generic (PLEG): container finished" podID="af09cb3a-3fd3-47c6-ba05-a79b0c66efac" containerID="aff1208c14ed1f39367b68c194105b29cfe9467eecbbe0cb07098fb08c4730ec" exitCode=0 Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.528550 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tcll6" event={"ID":"af09cb3a-3fd3-47c6-ba05-a79b0c66efac","Type":"ContainerDied","Data":"aff1208c14ed1f39367b68c194105b29cfe9467eecbbe0cb07098fb08c4730ec"} Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.561002 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/octavia-db-create-m2qsb"] Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.562596 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.580797 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-m2qsb"] Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.694008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k86z2\" (UniqueName: \"kubernetes.io/projected/0170b467-741d-4b47-9835-34356313f145-kube-api-access-k86z2\") pod \"octavia-db-create-m2qsb\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.694081 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0170b467-741d-4b47-9835-34356313f145-operator-scripts\") pod \"octavia-db-create-m2qsb\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.795941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k86z2\" (UniqueName: \"kubernetes.io/projected/0170b467-741d-4b47-9835-34356313f145-kube-api-access-k86z2\") pod \"octavia-db-create-m2qsb\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.795988 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0170b467-741d-4b47-9835-34356313f145-operator-scripts\") pod \"octavia-db-create-m2qsb\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc 
kubenswrapper[4764]: I1204 01:18:39.796942 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0170b467-741d-4b47-9835-34356313f145-operator-scripts\") pod \"octavia-db-create-m2qsb\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.815784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k86z2\" (UniqueName: \"kubernetes.io/projected/0170b467-741d-4b47-9835-34356313f145-kube-api-access-k86z2\") pod \"octavia-db-create-m2qsb\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:39 crc kubenswrapper[4764]: I1204 01:18:39.897096 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.078583 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vqk6r"] Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.109971 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vqk6r"] Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.350524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-m2qsb"] Dec 04 01:18:40 crc kubenswrapper[4764]: W1204 01:18:40.359611 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0170b467_741d_4b47_9835_34356313f145.slice/crio-8e485877c591e525ed5712d5fe5a322f09297b070c38120cef00df9b4f6422fb WatchSource:0}: Error finding container 8e485877c591e525ed5712d5fe5a322f09297b070c38120cef00df9b4f6422fb: Status 404 returned error can't find the container with id 8e485877c591e525ed5712d5fe5a322f09297b070c38120cef00df9b4f6422fb Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.536603 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-fn9rg" event={"ID":"3444b973-3337-4806-a444-4749df6c6fe9","Type":"ContainerStarted","Data":"8b594722e0656635f98f643246ad292c8c1fdd69a5972f00b4898ec8ecd12fa7"} Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.540209 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-m2qsb" event={"ID":"0170b467-741d-4b47-9835-34356313f145","Type":"ContainerStarted","Data":"8e485877c591e525ed5712d5fe5a322f09297b070c38120cef00df9b4f6422fb"} Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.544111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tcll6" event={"ID":"af09cb3a-3fd3-47c6-ba05-a79b0c66efac","Type":"ContainerStarted","Data":"223d8796f1f8cba600c7959587337d2d16e5eee69a1c1572b077b068fed3c019"} Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.544220 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tcll6" event={"ID":"af09cb3a-3fd3-47c6-ba05-a79b0c66efac","Type":"ContainerStarted","Data":"014c0fbaf6b3fd47195208946cc46157fc32fdd280fd81604c3c3d8ebbadde52"} Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.544326 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.544358 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.559128 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-fn9rg" podStartSLOduration=2.559109606 podStartE2EDuration="2.559109606s" podCreationTimestamp="2025-12-04 01:18:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:18:40.554498083 +0000 UTC 
m=+5856.315822494" watchObservedRunningTime="2025-12-04 01:18:40.559109606 +0000 UTC m=+5856.320434007" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.588069 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db96a1e7-672e-4350-ad6f-c3802c61809c" path="/var/lib/kubelet/pods/db96a1e7-672e-4350-ad6f-c3802c61809c/volumes" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.589872 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tcll6" podStartSLOduration=3.589852273 podStartE2EDuration="3.589852273s" podCreationTimestamp="2025-12-04 01:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:18:40.585767952 +0000 UTC m=+5856.347092363" watchObservedRunningTime="2025-12-04 01:18:40.589852273 +0000 UTC m=+5856.351176684" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.838419 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-aa08-account-create-update-p85k9"] Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.839577 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.842001 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.848415 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-aa08-account-create-update-p85k9"] Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.920239 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-operator-scripts\") pod \"octavia-aa08-account-create-update-p85k9\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:40 crc kubenswrapper[4764]: I1204 01:18:40.920536 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5pz8\" (UniqueName: \"kubernetes.io/projected/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-kube-api-access-g5pz8\") pod \"octavia-aa08-account-create-update-p85k9\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.022799 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5pz8\" (UniqueName: \"kubernetes.io/projected/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-kube-api-access-g5pz8\") pod \"octavia-aa08-account-create-update-p85k9\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.023148 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-operator-scripts\") pod 
\"octavia-aa08-account-create-update-p85k9\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.023816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-operator-scripts\") pod \"octavia-aa08-account-create-update-p85k9\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.060223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5pz8\" (UniqueName: \"kubernetes.io/projected/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-kube-api-access-g5pz8\") pod \"octavia-aa08-account-create-update-p85k9\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.160440 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.553696 4764 generic.go:334] "Generic (PLEG): container finished" podID="0170b467-741d-4b47-9835-34356313f145" containerID="83aa57f3e0388e223c20288b8b047d9643ab6a922bc1470e93173a66d30e2688" exitCode=0 Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.553780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-m2qsb" event={"ID":"0170b467-741d-4b47-9835-34356313f145","Type":"ContainerDied","Data":"83aa57f3e0388e223c20288b8b047d9643ab6a922bc1470e93173a66d30e2688"} Dec 04 01:18:41 crc kubenswrapper[4764]: I1204 01:18:41.688327 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-aa08-account-create-update-p85k9"] Dec 04 01:18:42 crc kubenswrapper[4764]: I1204 01:18:42.570062 4764 generic.go:334] "Generic (PLEG): container finished" podID="3f5ef1bd-61ec-47a9-829f-3214b0907f8e" containerID="a6dd8e8b88abc0a5f9cc045e7bcabb1c793b9c2cb341a9783f70a4be9b5da1dc" exitCode=0 Dec 04 01:18:42 crc kubenswrapper[4764]: I1204 01:18:42.570150 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-aa08-account-create-update-p85k9" event={"ID":"3f5ef1bd-61ec-47a9-829f-3214b0907f8e","Type":"ContainerDied","Data":"a6dd8e8b88abc0a5f9cc045e7bcabb1c793b9c2cb341a9783f70a4be9b5da1dc"} Dec 04 01:18:42 crc kubenswrapper[4764]: I1204 01:18:42.570563 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-aa08-account-create-update-p85k9" event={"ID":"3f5ef1bd-61ec-47a9-829f-3214b0907f8e","Type":"ContainerStarted","Data":"bf605bcd9e8bd3056575c99605c659109d23babb5c8cc07e622840e0210d6043"} Dec 04 01:18:42 crc kubenswrapper[4764]: I1204 01:18:42.992912 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.080969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0170b467-741d-4b47-9835-34356313f145-operator-scripts\") pod \"0170b467-741d-4b47-9835-34356313f145\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.081079 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k86z2\" (UniqueName: \"kubernetes.io/projected/0170b467-741d-4b47-9835-34356313f145-kube-api-access-k86z2\") pod \"0170b467-741d-4b47-9835-34356313f145\" (UID: \"0170b467-741d-4b47-9835-34356313f145\") " Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.082182 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0170b467-741d-4b47-9835-34356313f145-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0170b467-741d-4b47-9835-34356313f145" (UID: "0170b467-741d-4b47-9835-34356313f145"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.091082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0170b467-741d-4b47-9835-34356313f145-kube-api-access-k86z2" (OuterVolumeSpecName: "kube-api-access-k86z2") pod "0170b467-741d-4b47-9835-34356313f145" (UID: "0170b467-741d-4b47-9835-34356313f145"). InnerVolumeSpecName "kube-api-access-k86z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.184127 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0170b467-741d-4b47-9835-34356313f145-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.184180 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k86z2\" (UniqueName: \"kubernetes.io/projected/0170b467-741d-4b47-9835-34356313f145-kube-api-access-k86z2\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.589701 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-m2qsb" Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.589890 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-m2qsb" event={"ID":"0170b467-741d-4b47-9835-34356313f145","Type":"ContainerDied","Data":"8e485877c591e525ed5712d5fe5a322f09297b070c38120cef00df9b4f6422fb"} Dec 04 01:18:43 crc kubenswrapper[4764]: I1204 01:18:43.589944 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e485877c591e525ed5712d5fe5a322f09297b070c38120cef00df9b4f6422fb" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.031628 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.102200 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-operator-scripts\") pod \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.102324 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5pz8\" (UniqueName: \"kubernetes.io/projected/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-kube-api-access-g5pz8\") pod \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\" (UID: \"3f5ef1bd-61ec-47a9-829f-3214b0907f8e\") " Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.103620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f5ef1bd-61ec-47a9-829f-3214b0907f8e" (UID: "3f5ef1bd-61ec-47a9-829f-3214b0907f8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.107892 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-kube-api-access-g5pz8" (OuterVolumeSpecName: "kube-api-access-g5pz8") pod "3f5ef1bd-61ec-47a9-829f-3214b0907f8e" (UID: "3f5ef1bd-61ec-47a9-829f-3214b0907f8e"). InnerVolumeSpecName "kube-api-access-g5pz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.205851 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.205905 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5pz8\" (UniqueName: \"kubernetes.io/projected/3f5ef1bd-61ec-47a9-829f-3214b0907f8e-kube-api-access-g5pz8\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.605913 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-aa08-account-create-update-p85k9" event={"ID":"3f5ef1bd-61ec-47a9-829f-3214b0907f8e","Type":"ContainerDied","Data":"bf605bcd9e8bd3056575c99605c659109d23babb5c8cc07e622840e0210d6043"} Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.605947 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf605bcd9e8bd3056575c99605c659109d23babb5c8cc07e622840e0210d6043" Dec 04 01:18:44 crc kubenswrapper[4764]: I1204 01:18:44.606032 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-aa08-account-create-update-p85k9" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.522817 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-wt4kx"] Dec 04 01:18:46 crc kubenswrapper[4764]: E1204 01:18:46.523735 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0170b467-741d-4b47-9835-34356313f145" containerName="mariadb-database-create" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.523749 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0170b467-741d-4b47-9835-34356313f145" containerName="mariadb-database-create" Dec 04 01:18:46 crc kubenswrapper[4764]: E1204 01:18:46.523770 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f5ef1bd-61ec-47a9-829f-3214b0907f8e" containerName="mariadb-account-create-update" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.523778 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f5ef1bd-61ec-47a9-829f-3214b0907f8e" containerName="mariadb-account-create-update" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.523959 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0170b467-741d-4b47-9835-34356313f145" containerName="mariadb-database-create" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.523974 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f5ef1bd-61ec-47a9-829f-3214b0907f8e" containerName="mariadb-account-create-update" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.524555 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.568368 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-wt4kx"] Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.659011 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xrnj\" (UniqueName: \"kubernetes.io/projected/7face56a-a8b7-41d2-86db-75135a9dcaa0-kube-api-access-2xrnj\") pod \"octavia-persistence-db-create-wt4kx\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.659204 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7face56a-a8b7-41d2-86db-75135a9dcaa0-operator-scripts\") pod \"octavia-persistence-db-create-wt4kx\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.761815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xrnj\" (UniqueName: \"kubernetes.io/projected/7face56a-a8b7-41d2-86db-75135a9dcaa0-kube-api-access-2xrnj\") pod \"octavia-persistence-db-create-wt4kx\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.761940 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7face56a-a8b7-41d2-86db-75135a9dcaa0-operator-scripts\") pod \"octavia-persistence-db-create-wt4kx\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.763418 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7face56a-a8b7-41d2-86db-75135a9dcaa0-operator-scripts\") pod \"octavia-persistence-db-create-wt4kx\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.786439 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xrnj\" (UniqueName: \"kubernetes.io/projected/7face56a-a8b7-41d2-86db-75135a9dcaa0-kube-api-access-2xrnj\") pod \"octavia-persistence-db-create-wt4kx\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:46 crc kubenswrapper[4764]: I1204 01:18:46.864980 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.351819 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-wt4kx"] Dec 04 01:18:47 crc kubenswrapper[4764]: W1204 01:18:47.355928 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7face56a_a8b7_41d2_86db_75135a9dcaa0.slice/crio-5babec3a82c1fbe351951aecd33772d7839bb34b2b5d5910f113de57614894b6 WatchSource:0}: Error finding container 5babec3a82c1fbe351951aecd33772d7839bb34b2b5d5910f113de57614894b6: Status 404 returned error can't find the container with id 5babec3a82c1fbe351951aecd33772d7839bb34b2b5d5910f113de57614894b6 Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.531596 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-2042-account-create-update-zv9mp"] Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.533026 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.539819 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.541250 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2042-account-create-update-zv9mp"] Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.576002 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7tl\" (UniqueName: \"kubernetes.io/projected/505e42c6-8666-4acf-adf8-8ec135624e26-kube-api-access-xm7tl\") pod \"octavia-2042-account-create-update-zv9mp\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.576092 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e42c6-8666-4acf-adf8-8ec135624e26-operator-scripts\") pod \"octavia-2042-account-create-update-zv9mp\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.634858 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wt4kx" event={"ID":"7face56a-a8b7-41d2-86db-75135a9dcaa0","Type":"ContainerStarted","Data":"3129539e767f0a639eabe362500bd0499b3098aaba4b42cdbc80d8ad192059f7"} Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.634904 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wt4kx" event={"ID":"7face56a-a8b7-41d2-86db-75135a9dcaa0","Type":"ContainerStarted","Data":"5babec3a82c1fbe351951aecd33772d7839bb34b2b5d5910f113de57614894b6"} Dec 04 01:18:47 crc kubenswrapper[4764]: 
I1204 01:18:47.658453 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-persistence-db-create-wt4kx" podStartSLOduration=1.658433606 podStartE2EDuration="1.658433606s" podCreationTimestamp="2025-12-04 01:18:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:18:47.650161232 +0000 UTC m=+5863.411485643" watchObservedRunningTime="2025-12-04 01:18:47.658433606 +0000 UTC m=+5863.419758047" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.677202 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e42c6-8666-4acf-adf8-8ec135624e26-operator-scripts\") pod \"octavia-2042-account-create-update-zv9mp\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.677424 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7tl\" (UniqueName: \"kubernetes.io/projected/505e42c6-8666-4acf-adf8-8ec135624e26-kube-api-access-xm7tl\") pod \"octavia-2042-account-create-update-zv9mp\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.679111 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e42c6-8666-4acf-adf8-8ec135624e26-operator-scripts\") pod \"octavia-2042-account-create-update-zv9mp\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.710991 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7tl\" (UniqueName: 
\"kubernetes.io/projected/505e42c6-8666-4acf-adf8-8ec135624e26-kube-api-access-xm7tl\") pod \"octavia-2042-account-create-update-zv9mp\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:47 crc kubenswrapper[4764]: I1204 01:18:47.856438 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:48 crc kubenswrapper[4764]: I1204 01:18:48.348844 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-2042-account-create-update-zv9mp"] Dec 04 01:18:48 crc kubenswrapper[4764]: W1204 01:18:48.358252 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod505e42c6_8666_4acf_adf8_8ec135624e26.slice/crio-a36c58636a2657ccb4491324fc5eebdcffcac68f96eb367a8cb361161b3e2a68 WatchSource:0}: Error finding container a36c58636a2657ccb4491324fc5eebdcffcac68f96eb367a8cb361161b3e2a68: Status 404 returned error can't find the container with id a36c58636a2657ccb4491324fc5eebdcffcac68f96eb367a8cb361161b3e2a68 Dec 04 01:18:48 crc kubenswrapper[4764]: I1204 01:18:48.646587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2042-account-create-update-zv9mp" event={"ID":"505e42c6-8666-4acf-adf8-8ec135624e26","Type":"ContainerStarted","Data":"64defe28e1b50bea444fc2f226168b0d562b2a0cedcf8264d7a8fb01ec5c76bf"} Dec 04 01:18:48 crc kubenswrapper[4764]: I1204 01:18:48.647022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2042-account-create-update-zv9mp" event={"ID":"505e42c6-8666-4acf-adf8-8ec135624e26","Type":"ContainerStarted","Data":"a36c58636a2657ccb4491324fc5eebdcffcac68f96eb367a8cb361161b3e2a68"} Dec 04 01:18:48 crc kubenswrapper[4764]: I1204 01:18:48.650205 4764 generic.go:334] "Generic (PLEG): container finished" podID="7face56a-a8b7-41d2-86db-75135a9dcaa0" 
containerID="3129539e767f0a639eabe362500bd0499b3098aaba4b42cdbc80d8ad192059f7" exitCode=0 Dec 04 01:18:48 crc kubenswrapper[4764]: I1204 01:18:48.650256 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wt4kx" event={"ID":"7face56a-a8b7-41d2-86db-75135a9dcaa0","Type":"ContainerDied","Data":"3129539e767f0a639eabe362500bd0499b3098aaba4b42cdbc80d8ad192059f7"} Dec 04 01:18:48 crc kubenswrapper[4764]: I1204 01:18:48.684471 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-2042-account-create-update-zv9mp" podStartSLOduration=1.684447093 podStartE2EDuration="1.684447093s" podCreationTimestamp="2025-12-04 01:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:18:48.665539947 +0000 UTC m=+5864.426864368" watchObservedRunningTime="2025-12-04 01:18:48.684447093 +0000 UTC m=+5864.445771504" Dec 04 01:18:49 crc kubenswrapper[4764]: I1204 01:18:49.666886 4764 generic.go:334] "Generic (PLEG): container finished" podID="505e42c6-8666-4acf-adf8-8ec135624e26" containerID="64defe28e1b50bea444fc2f226168b0d562b2a0cedcf8264d7a8fb01ec5c76bf" exitCode=0 Dec 04 01:18:49 crc kubenswrapper[4764]: I1204 01:18:49.666997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2042-account-create-update-zv9mp" event={"ID":"505e42c6-8666-4acf-adf8-8ec135624e26","Type":"ContainerDied","Data":"64defe28e1b50bea444fc2f226168b0d562b2a0cedcf8264d7a8fb01ec5c76bf"} Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.120981 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.231707 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xrnj\" (UniqueName: \"kubernetes.io/projected/7face56a-a8b7-41d2-86db-75135a9dcaa0-kube-api-access-2xrnj\") pod \"7face56a-a8b7-41d2-86db-75135a9dcaa0\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.232004 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7face56a-a8b7-41d2-86db-75135a9dcaa0-operator-scripts\") pod \"7face56a-a8b7-41d2-86db-75135a9dcaa0\" (UID: \"7face56a-a8b7-41d2-86db-75135a9dcaa0\") " Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.232524 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7face56a-a8b7-41d2-86db-75135a9dcaa0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7face56a-a8b7-41d2-86db-75135a9dcaa0" (UID: "7face56a-a8b7-41d2-86db-75135a9dcaa0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.237565 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7face56a-a8b7-41d2-86db-75135a9dcaa0-kube-api-access-2xrnj" (OuterVolumeSpecName: "kube-api-access-2xrnj") pod "7face56a-a8b7-41d2-86db-75135a9dcaa0" (UID: "7face56a-a8b7-41d2-86db-75135a9dcaa0"). InnerVolumeSpecName "kube-api-access-2xrnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.334185 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7face56a-a8b7-41d2-86db-75135a9dcaa0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.334224 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xrnj\" (UniqueName: \"kubernetes.io/projected/7face56a-a8b7-41d2-86db-75135a9dcaa0-kube-api-access-2xrnj\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.690117 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-wt4kx" Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.691667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-wt4kx" event={"ID":"7face56a-a8b7-41d2-86db-75135a9dcaa0","Type":"ContainerDied","Data":"5babec3a82c1fbe351951aecd33772d7839bb34b2b5d5910f113de57614894b6"} Dec 04 01:18:50 crc kubenswrapper[4764]: I1204 01:18:50.691883 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5babec3a82c1fbe351951aecd33772d7839bb34b2b5d5910f113de57614894b6" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.144902 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.266570 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e42c6-8666-4acf-adf8-8ec135624e26-operator-scripts\") pod \"505e42c6-8666-4acf-adf8-8ec135624e26\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.266846 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm7tl\" (UniqueName: \"kubernetes.io/projected/505e42c6-8666-4acf-adf8-8ec135624e26-kube-api-access-xm7tl\") pod \"505e42c6-8666-4acf-adf8-8ec135624e26\" (UID: \"505e42c6-8666-4acf-adf8-8ec135624e26\") " Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.267820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/505e42c6-8666-4acf-adf8-8ec135624e26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "505e42c6-8666-4acf-adf8-8ec135624e26" (UID: "505e42c6-8666-4acf-adf8-8ec135624e26"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.275964 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/505e42c6-8666-4acf-adf8-8ec135624e26-kube-api-access-xm7tl" (OuterVolumeSpecName: "kube-api-access-xm7tl") pod "505e42c6-8666-4acf-adf8-8ec135624e26" (UID: "505e42c6-8666-4acf-adf8-8ec135624e26"). InnerVolumeSpecName "kube-api-access-xm7tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.370065 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm7tl\" (UniqueName: \"kubernetes.io/projected/505e42c6-8666-4acf-adf8-8ec135624e26-kube-api-access-xm7tl\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.370099 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/505e42c6-8666-4acf-adf8-8ec135624e26-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.708425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-2042-account-create-update-zv9mp" event={"ID":"505e42c6-8666-4acf-adf8-8ec135624e26","Type":"ContainerDied","Data":"a36c58636a2657ccb4491324fc5eebdcffcac68f96eb367a8cb361161b3e2a68"} Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.708473 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a36c58636a2657ccb4491324fc5eebdcffcac68f96eb367a8cb361161b3e2a68" Dec 04 01:18:51 crc kubenswrapper[4764]: I1204 01:18:51.708563 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-2042-account-create-update-zv9mp" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.601020 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-58457b4d67-ncwfc"] Dec 04 01:18:53 crc kubenswrapper[4764]: E1204 01:18:53.601827 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="505e42c6-8666-4acf-adf8-8ec135624e26" containerName="mariadb-account-create-update" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.601844 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="505e42c6-8666-4acf-adf8-8ec135624e26" containerName="mariadb-account-create-update" Dec 04 01:18:53 crc kubenswrapper[4764]: E1204 01:18:53.601896 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7face56a-a8b7-41d2-86db-75135a9dcaa0" containerName="mariadb-database-create" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.601905 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7face56a-a8b7-41d2-86db-75135a9dcaa0" containerName="mariadb-database-create" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.602157 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="505e42c6-8666-4acf-adf8-8ec135624e26" containerName="mariadb-account-create-update" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.602184 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7face56a-a8b7-41d2-86db-75135a9dcaa0" containerName="mariadb-database-create" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.603924 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.607805 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.607981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-lgwlt" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.608061 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.625812 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-58457b4d67-ncwfc"] Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.725471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8703e4e1-060c-47a1-b9ce-99de3d89fe80-octavia-run\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.725710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-combined-ca-bundle\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.725842 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8703e4e1-060c-47a1-b9ce-99de3d89fe80-config-data-merged\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc 
kubenswrapper[4764]: I1204 01:18:53.725988 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-scripts\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.726139 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-config-data\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.827382 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-config-data\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.827557 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8703e4e1-060c-47a1-b9ce-99de3d89fe80-octavia-run\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.827593 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-combined-ca-bundle\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.827623 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8703e4e1-060c-47a1-b9ce-99de3d89fe80-config-data-merged\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.827656 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-scripts\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.828572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/8703e4e1-060c-47a1-b9ce-99de3d89fe80-octavia-run\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.828581 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/8703e4e1-060c-47a1-b9ce-99de3d89fe80-config-data-merged\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.833223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-scripts\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.834369 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-combined-ca-bundle\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.834399 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8703e4e1-060c-47a1-b9ce-99de3d89fe80-config-data\") pod \"octavia-api-58457b4d67-ncwfc\" (UID: \"8703e4e1-060c-47a1-b9ce-99de3d89fe80\") " pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:53 crc kubenswrapper[4764]: I1204 01:18:53.926950 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:18:54 crc kubenswrapper[4764]: I1204 01:18:54.038460 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9vcrj"] Dec 04 01:18:54 crc kubenswrapper[4764]: I1204 01:18:54.053434 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9vcrj"] Dec 04 01:18:54 crc kubenswrapper[4764]: I1204 01:18:54.467469 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-58457b4d67-ncwfc"] Dec 04 01:18:54 crc kubenswrapper[4764]: I1204 01:18:54.557057 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9ea037-924a-490f-ac1b-96d82921c1fb" path="/var/lib/kubelet/pods/5d9ea037-924a-490f-ac1b-96d82921c1fb/volumes" Dec 04 01:18:54 crc kubenswrapper[4764]: I1204 01:18:54.735978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-58457b4d67-ncwfc" event={"ID":"8703e4e1-060c-47a1-b9ce-99de3d89fe80","Type":"ContainerStarted","Data":"a67000bbc50e6cd69df12cd4d58575932b1877b1de0dec69203d658a89d3705f"} Dec 04 01:19:02 crc kubenswrapper[4764]: I1204 01:19:02.852231 4764 generic.go:334] "Generic (PLEG): container finished" podID="8703e4e1-060c-47a1-b9ce-99de3d89fe80" 
containerID="ee8e875bc59b73d5bbeda996299b4c2d12a1e2f04d727559aa69dc5ab17a4262" exitCode=0 Dec 04 01:19:02 crc kubenswrapper[4764]: I1204 01:19:02.852766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-58457b4d67-ncwfc" event={"ID":"8703e4e1-060c-47a1-b9ce-99de3d89fe80","Type":"ContainerDied","Data":"ee8e875bc59b73d5bbeda996299b4c2d12a1e2f04d727559aa69dc5ab17a4262"} Dec 04 01:19:03 crc kubenswrapper[4764]: I1204 01:19:03.867485 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-58457b4d67-ncwfc" event={"ID":"8703e4e1-060c-47a1-b9ce-99de3d89fe80","Type":"ContainerStarted","Data":"d1619cfbd3dfeb989bc2079ea38b63e54c2a7e0d5cc3bbc01aafcb56fd30891b"} Dec 04 01:19:03 crc kubenswrapper[4764]: I1204 01:19:03.867827 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:19:03 crc kubenswrapper[4764]: I1204 01:19:03.867849 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:19:03 crc kubenswrapper[4764]: I1204 01:19:03.867868 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-58457b4d67-ncwfc" event={"ID":"8703e4e1-060c-47a1-b9ce-99de3d89fe80","Type":"ContainerStarted","Data":"668f57f847d5d8e52de43e5941e1a1df2eb11ecab9dbabe1a791783216b3826f"} Dec 04 01:19:03 crc kubenswrapper[4764]: I1204 01:19:03.910206 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-58457b4d67-ncwfc" podStartSLOduration=2.920687133 podStartE2EDuration="10.910186925s" podCreationTimestamp="2025-12-04 01:18:53 +0000 UTC" firstStartedPulling="2025-12-04 01:18:54.467184582 +0000 UTC m=+5870.228508993" lastFinishedPulling="2025-12-04 01:19:02.456684374 +0000 UTC m=+5878.218008785" observedRunningTime="2025-12-04 01:19:03.894253332 +0000 UTC m=+5879.655577743" watchObservedRunningTime="2025-12-04 01:19:03.910186925 +0000 UTC 
m=+5879.671511346" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.104366 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-wqtrg"] Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.108138 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.113584 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.113934 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.114184 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.128850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-wqtrg"] Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.268664 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/becbf2e8-4899-4a6e-893c-84ffd4617c27-scripts\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.268776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/becbf2e8-4899-4a6e-893c-84ffd4617c27-config-data-merged\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.268907 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/becbf2e8-4899-4a6e-893c-84ffd4617c27-config-data\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.269021 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/becbf2e8-4899-4a6e-893c-84ffd4617c27-hm-ports\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.371299 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/becbf2e8-4899-4a6e-893c-84ffd4617c27-scripts\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.371381 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/becbf2e8-4899-4a6e-893c-84ffd4617c27-config-data-merged\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.371518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becbf2e8-4899-4a6e-893c-84ffd4617c27-config-data\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.371630 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/becbf2e8-4899-4a6e-893c-84ffd4617c27-hm-ports\") pod \"octavia-rsyslog-wqtrg\" (UID: 
\"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.373266 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/becbf2e8-4899-4a6e-893c-84ffd4617c27-hm-ports\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.373884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/becbf2e8-4899-4a6e-893c-84ffd4617c27-config-data-merged\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.380784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/becbf2e8-4899-4a6e-893c-84ffd4617c27-scripts\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.381868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/becbf2e8-4899-4a6e-893c-84ffd4617c27-config-data\") pod \"octavia-rsyslog-wqtrg\" (UID: \"becbf2e8-4899-4a6e-893c-84ffd4617c27\") " pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.440682 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.799881 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-rsv44"] Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.802272 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.813314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.820632 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-rsv44"] Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.885793 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5f940d19-5555-4701-8c57-34396792e9cb-amphora-image\") pod \"octavia-image-upload-56c9f55b99-rsv44\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") " pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.885904 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f940d19-5555-4701-8c57-34396792e9cb-httpd-config\") pod \"octavia-image-upload-56c9f55b99-rsv44\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") " pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.992775 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f940d19-5555-4701-8c57-34396792e9cb-httpd-config\") pod \"octavia-image-upload-56c9f55b99-rsv44\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") " pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.993463 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5f940d19-5555-4701-8c57-34396792e9cb-amphora-image\") pod \"octavia-image-upload-56c9f55b99-rsv44\" (UID: 
\"5f940d19-5555-4701-8c57-34396792e9cb\") " pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:11 crc kubenswrapper[4764]: I1204 01:19:11.993967 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5f940d19-5555-4701-8c57-34396792e9cb-amphora-image\") pod \"octavia-image-upload-56c9f55b99-rsv44\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") " pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.006493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f940d19-5555-4701-8c57-34396792e9cb-httpd-config\") pod \"octavia-image-upload-56c9f55b99-rsv44\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") " pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.007297 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-wqtrg"] Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.124974 4764 scope.go:117] "RemoveContainer" containerID="854a4bef5dc6e0fc8518e4ef4592640a0f782cb9a570bef229093b7030cc69cf" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.128668 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.188166 4764 scope.go:117] "RemoveContainer" containerID="9f118afdcc1fad1d9a4385b345c9048296d3f0e90089966e80a3e11a8ae4da0d" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.213254 4764 scope.go:117] "RemoveContainer" containerID="757fef3a8aa70238a69a8d8d417e4faf39147bdd5ee70299b9af5eb4ea4d2c7e" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.270970 4764 scope.go:117] "RemoveContainer" containerID="054ca7e537e8dcbed4eac89cc315a82e736a5cf163fb6cc4e51fa032c82b091b" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.528497 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-k6vxt" podUID="462adcd9-211b-4d5f-9ebc-4289708c9ee9" containerName="ovn-controller" probeResult="failure" output=< Dec 04 01:19:12 crc kubenswrapper[4764]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 01:19:12 crc kubenswrapper[4764]: > Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.571186 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.573053 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tcll6" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.691310 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k6vxt-config-5tjhs"] Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.693062 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.695956 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.758266 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-rsv44"] Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.777240 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k6vxt-config-5tjhs"] Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.844055 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-additional-scripts\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.844130 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-log-ovn\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.844218 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v82qm\" (UniqueName: \"kubernetes.io/projected/b4d86e55-df77-455d-97b9-9ee53111230f-kube-api-access-v82qm\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.844258 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-scripts\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.844282 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run-ovn\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.844316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.945924 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-additional-scripts\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.946282 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-log-ovn\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.946367 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v82qm\" (UniqueName: \"kubernetes.io/projected/b4d86e55-df77-455d-97b9-9ee53111230f-kube-api-access-v82qm\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.946427 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-scripts\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.946457 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run-ovn\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.946508 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.946888 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.947744 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-additional-scripts\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.947814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-log-ovn\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.950446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-scripts\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.950553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run-ovn\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:12 crc kubenswrapper[4764]: I1204 01:19:12.970230 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v82qm\" (UniqueName: \"kubernetes.io/projected/b4d86e55-df77-455d-97b9-9ee53111230f-kube-api-access-v82qm\") pod \"ovn-controller-k6vxt-config-5tjhs\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:13 crc kubenswrapper[4764]: I1204 01:19:13.007517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-image-upload-56c9f55b99-rsv44" event={"ID":"5f940d19-5555-4701-8c57-34396792e9cb","Type":"ContainerStarted","Data":"0b1c4b2231d5c33c1b3a899c1d1c2ec66457823402e2b36925b00ad660a17f5c"} Dec 04 01:19:13 crc kubenswrapper[4764]: I1204 01:19:13.025020 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wqtrg" event={"ID":"becbf2e8-4899-4a6e-893c-84ffd4617c27","Type":"ContainerStarted","Data":"05233d657f8111cde160e460c19d430045475bf9bcbef4c45e04b7bc73bfda40"} Dec 04 01:19:13 crc kubenswrapper[4764]: I1204 01:19:13.066181 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:13 crc kubenswrapper[4764]: I1204 01:19:13.251914 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:19:13 crc kubenswrapper[4764]: I1204 01:19:13.346965 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-58457b4d67-ncwfc" Dec 04 01:19:14 crc kubenswrapper[4764]: I1204 01:19:14.154749 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k6vxt-config-5tjhs"] Dec 04 01:19:15 crc kubenswrapper[4764]: I1204 01:19:15.049985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt-config-5tjhs" event={"ID":"b4d86e55-df77-455d-97b9-9ee53111230f","Type":"ContainerStarted","Data":"0b7a763a62d6a903ede3dc5898b6643c6dffe366ea029af41c6c79cfd6592e04"} Dec 04 01:19:15 crc kubenswrapper[4764]: I1204 01:19:15.052710 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wqtrg" event={"ID":"becbf2e8-4899-4a6e-893c-84ffd4617c27","Type":"ContainerStarted","Data":"68c71f514df71a3218706cfaab5fc37092fa74dce3d8ac7b835ed68b784c91b2"} Dec 04 01:19:16 crc kubenswrapper[4764]: I1204 01:19:16.067352 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-k6vxt-config-5tjhs" event={"ID":"b4d86e55-df77-455d-97b9-9ee53111230f","Type":"ContainerStarted","Data":"7b52487e1c43319a44c0b96d4caed9abfc8996deb92b33a530ca4df7e8a61b41"} Dec 04 01:19:16 crc kubenswrapper[4764]: I1204 01:19:16.092781 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-k6vxt-config-5tjhs" podStartSLOduration=4.092762614 podStartE2EDuration="4.092762614s" podCreationTimestamp="2025-12-04 01:19:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:19:16.08525729 +0000 UTC m=+5891.846581711" watchObservedRunningTime="2025-12-04 01:19:16.092762614 +0000 UTC m=+5891.854087035" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.076687 4764 generic.go:334] "Generic (PLEG): container finished" podID="becbf2e8-4899-4a6e-893c-84ffd4617c27" containerID="68c71f514df71a3218706cfaab5fc37092fa74dce3d8ac7b835ed68b784c91b2" exitCode=0 Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.076764 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wqtrg" event={"ID":"becbf2e8-4899-4a6e-893c-84ffd4617c27","Type":"ContainerDied","Data":"68c71f514df71a3218706cfaab5fc37092fa74dce3d8ac7b835ed68b784c91b2"} Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.079880 4764 generic.go:334] "Generic (PLEG): container finished" podID="b4d86e55-df77-455d-97b9-9ee53111230f" containerID="7b52487e1c43319a44c0b96d4caed9abfc8996deb92b33a530ca4df7e8a61b41" exitCode=0 Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.079924 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt-config-5tjhs" event={"ID":"b4d86e55-df77-455d-97b9-9ee53111230f","Type":"ContainerDied","Data":"7b52487e1c43319a44c0b96d4caed9abfc8996deb92b33a530ca4df7e8a61b41"} Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.473561 4764 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-ld9jx"] Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.475680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.479010 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.487227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-ld9jx"] Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.532625 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-k6vxt" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.576387 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-combined-ca-bundle\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.576456 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-scripts\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.576710 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data-merged\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.576864 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.678973 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.679887 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-combined-ca-bundle\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.680262 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-scripts\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.680757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data-merged\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.682167 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data-merged\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.690701 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-combined-ca-bundle\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.699623 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-scripts\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.701966 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data\") pod \"octavia-db-sync-ld9jx\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") " pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:17 crc kubenswrapper[4764]: I1204 01:19:17.814379 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-ld9jx" Dec 04 01:19:18 crc kubenswrapper[4764]: I1204 01:19:18.285656 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-ld9jx"] Dec 04 01:19:19 crc kubenswrapper[4764]: W1204 01:19:19.676130 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb221cfbc_e72d_4f2b_80b5_d32c11e2f963.slice/crio-497bb5ef196f543046aa18bdf8ac78a3f6e4679bbe7b29b0d27aa282e1da18bb WatchSource:0}: Error finding container 497bb5ef196f543046aa18bdf8ac78a3f6e4679bbe7b29b0d27aa282e1da18bb: Status 404 returned error can't find the container with id 497bb5ef196f543046aa18bdf8ac78a3f6e4679bbe7b29b0d27aa282e1da18bb Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.789188 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.934374 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-additional-scripts\") pod \"b4d86e55-df77-455d-97b9-9ee53111230f\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.934915 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run\") pod \"b4d86e55-df77-455d-97b9-9ee53111230f\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.934952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-scripts\") pod \"b4d86e55-df77-455d-97b9-9ee53111230f\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " 
Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.934992 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run-ovn\") pod \"b4d86e55-df77-455d-97b9-9ee53111230f\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935040 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v82qm\" (UniqueName: \"kubernetes.io/projected/b4d86e55-df77-455d-97b9-9ee53111230f-kube-api-access-v82qm\") pod \"b4d86e55-df77-455d-97b9-9ee53111230f\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935125 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b4d86e55-df77-455d-97b9-9ee53111230f" (UID: "b4d86e55-df77-455d-97b9-9ee53111230f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935579 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-log-ovn\") pod \"b4d86e55-df77-455d-97b9-9ee53111230f\" (UID: \"b4d86e55-df77-455d-97b9-9ee53111230f\") " Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935888 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b4d86e55-df77-455d-97b9-9ee53111230f" (UID: "b4d86e55-df77-455d-97b9-9ee53111230f"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935948 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run" (OuterVolumeSpecName: "var-run") pod "b4d86e55-df77-455d-97b9-9ee53111230f" (UID: "b4d86e55-df77-455d-97b9-9ee53111230f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935988 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-scripts" (OuterVolumeSpecName: "scripts") pod "b4d86e55-df77-455d-97b9-9ee53111230f" (UID: "b4d86e55-df77-455d-97b9-9ee53111230f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.935994 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b4d86e55-df77-455d-97b9-9ee53111230f" (UID: "b4d86e55-df77-455d-97b9-9ee53111230f"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.936690 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.936710 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.936740 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.936754 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b4d86e55-df77-455d-97b9-9ee53111230f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.936767 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b4d86e55-df77-455d-97b9-9ee53111230f-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:19 crc kubenswrapper[4764]: I1204 01:19:19.942087 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4d86e55-df77-455d-97b9-9ee53111230f-kube-api-access-v82qm" (OuterVolumeSpecName: "kube-api-access-v82qm") pod "b4d86e55-df77-455d-97b9-9ee53111230f" (UID: "b4d86e55-df77-455d-97b9-9ee53111230f"). InnerVolumeSpecName "kube-api-access-v82qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.038506 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v82qm\" (UniqueName: \"kubernetes.io/projected/b4d86e55-df77-455d-97b9-9ee53111230f-kube-api-access-v82qm\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.124288 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ld9jx" event={"ID":"b221cfbc-e72d-4f2b-80b5-d32c11e2f963","Type":"ContainerStarted","Data":"497bb5ef196f543046aa18bdf8ac78a3f6e4679bbe7b29b0d27aa282e1da18bb"} Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.126906 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt-config-5tjhs" event={"ID":"b4d86e55-df77-455d-97b9-9ee53111230f","Type":"ContainerDied","Data":"0b7a763a62d6a903ede3dc5898b6643c6dffe366ea029af41c6c79cfd6592e04"} Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.126934 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b7a763a62d6a903ede3dc5898b6643c6dffe366ea029af41c6c79cfd6592e04" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.126987 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-5tjhs" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.872980 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-k6vxt-config-5tjhs"] Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.884756 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-k6vxt-config-5tjhs"] Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.983960 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-k6vxt-config-7dgg7"] Dec 04 01:19:20 crc kubenswrapper[4764]: E1204 01:19:20.992580 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4d86e55-df77-455d-97b9-9ee53111230f" containerName="ovn-config" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.992616 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4d86e55-df77-455d-97b9-9ee53111230f" containerName="ovn-config" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.994214 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4d86e55-df77-455d-97b9-9ee53111230f" containerName="ovn-config" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.995003 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:20 crc kubenswrapper[4764]: I1204 01:19:20.997282 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.010159 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k6vxt-config-7dgg7"] Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.172381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-log-ovn\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.172457 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6qv\" (UniqueName: \"kubernetes.io/projected/67cbce5b-affe-4e6a-a146-fd4f1a30c951-kube-api-access-2q6qv\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.172660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-additional-scripts\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.172831 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: 
\"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.172935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run-ovn\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.172967 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-scripts\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.275020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6qv\" (UniqueName: \"kubernetes.io/projected/67cbce5b-affe-4e6a-a146-fd4f1a30c951-kube-api-access-2q6qv\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.275422 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-additional-scripts\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.276153 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-additional-scripts\") pod 
\"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.276214 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.276455 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.276510 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run-ovn\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.276572 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run-ovn\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.276610 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-scripts\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: 
\"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.277014 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-log-ovn\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.277205 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-log-ovn\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.278882 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-scripts\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.292339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6qv\" (UniqueName: \"kubernetes.io/projected/67cbce5b-affe-4e6a-a146-fd4f1a30c951-kube-api-access-2q6qv\") pod \"ovn-controller-k6vxt-config-7dgg7\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:21 crc kubenswrapper[4764]: I1204 01:19:21.324919 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:22 crc kubenswrapper[4764]: I1204 01:19:22.556017 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4d86e55-df77-455d-97b9-9ee53111230f" path="/var/lib/kubelet/pods/b4d86e55-df77-455d-97b9-9ee53111230f/volumes" Dec 04 01:19:24 crc kubenswrapper[4764]: I1204 01:19:24.183801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" event={"ID":"5f940d19-5555-4701-8c57-34396792e9cb","Type":"ContainerStarted","Data":"ba12160a89b51d479a52afbdb3a714cd6f2787d3b90e2f0bd0e08e12140d0fa0"} Dec 04 01:19:24 crc kubenswrapper[4764]: I1204 01:19:24.188364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-wqtrg" event={"ID":"becbf2e8-4899-4a6e-893c-84ffd4617c27","Type":"ContainerStarted","Data":"f4e83ccdbe1e60abb937d739e69440b036c1045e7f5d5a7a096707f91c40b7f6"} Dec 04 01:19:24 crc kubenswrapper[4764]: I1204 01:19:24.189382 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-wqtrg" Dec 04 01:19:24 crc kubenswrapper[4764]: I1204 01:19:24.193910 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ld9jx" event={"ID":"b221cfbc-e72d-4f2b-80b5-d32c11e2f963","Type":"ContainerStarted","Data":"3abb55e315453663e15ef4b9a85d5f9d785a4a06bef7407ecae0db2141dce5df"} Dec 04 01:19:24 crc kubenswrapper[4764]: I1204 01:19:24.243659 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-wqtrg" podStartSLOduration=1.469629187 podStartE2EDuration="13.24364221s" podCreationTimestamp="2025-12-04 01:19:11 +0000 UTC" firstStartedPulling="2025-12-04 01:19:12.033544172 +0000 UTC m=+5887.794868583" lastFinishedPulling="2025-12-04 01:19:23.807557195 +0000 UTC m=+5899.568881606" observedRunningTime="2025-12-04 01:19:24.22580263 +0000 UTC m=+5899.987127061" 
watchObservedRunningTime="2025-12-04 01:19:24.24364221 +0000 UTC m=+5900.004966621" Dec 04 01:19:24 crc kubenswrapper[4764]: I1204 01:19:24.258283 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-k6vxt-config-7dgg7"] Dec 04 01:19:25 crc kubenswrapper[4764]: I1204 01:19:25.205778 4764 generic.go:334] "Generic (PLEG): container finished" podID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerID="3abb55e315453663e15ef4b9a85d5f9d785a4a06bef7407ecae0db2141dce5df" exitCode=0 Dec 04 01:19:25 crc kubenswrapper[4764]: I1204 01:19:25.205879 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ld9jx" event={"ID":"b221cfbc-e72d-4f2b-80b5-d32c11e2f963","Type":"ContainerDied","Data":"3abb55e315453663e15ef4b9a85d5f9d785a4a06bef7407ecae0db2141dce5df"} Dec 04 01:19:25 crc kubenswrapper[4764]: I1204 01:19:25.212740 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f940d19-5555-4701-8c57-34396792e9cb" containerID="ba12160a89b51d479a52afbdb3a714cd6f2787d3b90e2f0bd0e08e12140d0fa0" exitCode=0 Dec 04 01:19:25 crc kubenswrapper[4764]: I1204 01:19:25.212796 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" event={"ID":"5f940d19-5555-4701-8c57-34396792e9cb","Type":"ContainerDied","Data":"ba12160a89b51d479a52afbdb3a714cd6f2787d3b90e2f0bd0e08e12140d0fa0"} Dec 04 01:19:25 crc kubenswrapper[4764]: I1204 01:19:25.215763 4764 generic.go:334] "Generic (PLEG): container finished" podID="67cbce5b-affe-4e6a-a146-fd4f1a30c951" containerID="888c0b683b0853fbb70012ad304a4b9bf88f289c8958101f0b312cccdc96942a" exitCode=0 Dec 04 01:19:25 crc kubenswrapper[4764]: I1204 01:19:25.215920 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt-config-7dgg7" event={"ID":"67cbce5b-affe-4e6a-a146-fd4f1a30c951","Type":"ContainerDied","Data":"888c0b683b0853fbb70012ad304a4b9bf88f289c8958101f0b312cccdc96942a"} Dec 04 01:19:25 crc 
kubenswrapper[4764]: I1204 01:19:25.216013 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt-config-7dgg7" event={"ID":"67cbce5b-affe-4e6a-a146-fd4f1a30c951","Type":"ContainerStarted","Data":"33df227f67c5a151f1d6ba2083349460612d0871f474aa6ec7bb7f52bcfd799f"} Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.228520 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ld9jx" event={"ID":"b221cfbc-e72d-4f2b-80b5-d32c11e2f963","Type":"ContainerStarted","Data":"a7a23246d03c143486966bb236268e1f1dc091660e349e2e3494d069d1434159"} Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.233146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" event={"ID":"5f940d19-5555-4701-8c57-34396792e9cb","Type":"ContainerStarted","Data":"a5fb2555544c9f0a842c11bc4884b591fde087d60592aec23077352ace2d3526"} Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.263406 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-ld9jx" podStartSLOduration=9.263385419 podStartE2EDuration="9.263385419s" podCreationTimestamp="2025-12-04 01:19:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:19:26.251279691 +0000 UTC m=+5902.012604112" watchObservedRunningTime="2025-12-04 01:19:26.263385419 +0000 UTC m=+5902.024709840" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.287782 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" podStartSLOduration=4.057289616 podStartE2EDuration="15.287750449s" podCreationTimestamp="2025-12-04 01:19:11 +0000 UTC" firstStartedPulling="2025-12-04 01:19:12.745974119 +0000 UTC m=+5888.507298530" lastFinishedPulling="2025-12-04 01:19:23.976434932 +0000 UTC m=+5899.737759363" observedRunningTime="2025-12-04 
01:19:26.271342555 +0000 UTC m=+5902.032666966" watchObservedRunningTime="2025-12-04 01:19:26.287750449 +0000 UTC m=+5902.049074870" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.641866 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-7dgg7" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790379 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-log-ovn\") pod \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run-ovn\") pod \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790543 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run\") pod \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790527 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "67cbce5b-affe-4e6a-a146-fd4f1a30c951" (UID: "67cbce5b-affe-4e6a-a146-fd4f1a30c951"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790674 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-additional-scripts\") pod \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790704 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6qv\" (UniqueName: \"kubernetes.io/projected/67cbce5b-affe-4e6a-a146-fd4f1a30c951-kube-api-access-2q6qv\") pod \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790819 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-scripts\") pod \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\" (UID: \"67cbce5b-affe-4e6a-a146-fd4f1a30c951\") " Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.791267 4764 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790599 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run" (OuterVolumeSpecName: "var-run") pod "67cbce5b-affe-4e6a-a146-fd4f1a30c951" (UID: "67cbce5b-affe-4e6a-a146-fd4f1a30c951"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.790622 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "67cbce5b-affe-4e6a-a146-fd4f1a30c951" (UID: "67cbce5b-affe-4e6a-a146-fd4f1a30c951"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.791611 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "67cbce5b-affe-4e6a-a146-fd4f1a30c951" (UID: "67cbce5b-affe-4e6a-a146-fd4f1a30c951"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.792255 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-scripts" (OuterVolumeSpecName: "scripts") pod "67cbce5b-affe-4e6a-a146-fd4f1a30c951" (UID: "67cbce5b-affe-4e6a-a146-fd4f1a30c951"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.800693 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67cbce5b-affe-4e6a-a146-fd4f1a30c951-kube-api-access-2q6qv" (OuterVolumeSpecName: "kube-api-access-2q6qv") pod "67cbce5b-affe-4e6a-a146-fd4f1a30c951" (UID: "67cbce5b-affe-4e6a-a146-fd4f1a30c951"). InnerVolumeSpecName "kube-api-access-2q6qv". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.893160 4764 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.893198 4764 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/67cbce5b-affe-4e6a-a146-fd4f1a30c951-var-run\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.893209 4764 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.893223 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6qv\" (UniqueName: \"kubernetes.io/projected/67cbce5b-affe-4e6a-a146-fd4f1a30c951-kube-api-access-2q6qv\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:26 crc kubenswrapper[4764]: I1204 01:19:26.893237 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67cbce5b-affe-4e6a-a146-fd4f1a30c951-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:27 crc kubenswrapper[4764]: I1204 01:19:27.256938 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-k6vxt-config-7dgg7" event={"ID":"67cbce5b-affe-4e6a-a146-fd4f1a30c951","Type":"ContainerDied","Data":"33df227f67c5a151f1d6ba2083349460612d0871f474aa6ec7bb7f52bcfd799f"}
Dec 04 01:19:27 crc kubenswrapper[4764]: I1204 01:19:27.257285 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33df227f67c5a151f1d6ba2083349460612d0871f474aa6ec7bb7f52bcfd799f"
Dec 04 01:19:27 crc kubenswrapper[4764]: I1204 01:19:27.257488 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-k6vxt-config-7dgg7"
Dec 04 01:19:27 crc kubenswrapper[4764]: I1204 01:19:27.739658 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-k6vxt-config-7dgg7"]
Dec 04 01:19:27 crc kubenswrapper[4764]: I1204 01:19:27.750282 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-k6vxt-config-7dgg7"]
Dec 04 01:19:28 crc kubenswrapper[4764]: I1204 01:19:28.556805 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67cbce5b-affe-4e6a-a146-fd4f1a30c951" path="/var/lib/kubelet/pods/67cbce5b-affe-4e6a-a146-fd4f1a30c951/volumes"
Dec 04 01:19:31 crc kubenswrapper[4764]: I1204 01:19:31.299191 4764 generic.go:334] "Generic (PLEG): container finished" podID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerID="a7a23246d03c143486966bb236268e1f1dc091660e349e2e3494d069d1434159" exitCode=0
Dec 04 01:19:31 crc kubenswrapper[4764]: I1204 01:19:31.299212 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ld9jx" event={"ID":"b221cfbc-e72d-4f2b-80b5-d32c11e2f963","Type":"ContainerDied","Data":"a7a23246d03c143486966bb236268e1f1dc091660e349e2e3494d069d1434159"}
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.731351 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-ld9jx"
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.926875 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data\") pod \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") "
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.927291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-scripts\") pod \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") "
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.927394 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-combined-ca-bundle\") pod \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") "
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.927474 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data-merged\") pod \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\" (UID: \"b221cfbc-e72d-4f2b-80b5-d32c11e2f963\") "
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.933598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-scripts" (OuterVolumeSpecName: "scripts") pod "b221cfbc-e72d-4f2b-80b5-d32c11e2f963" (UID: "b221cfbc-e72d-4f2b-80b5-d32c11e2f963"). InnerVolumeSpecName "scripts".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.940075 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data" (OuterVolumeSpecName: "config-data") pod "b221cfbc-e72d-4f2b-80b5-d32c11e2f963" (UID: "b221cfbc-e72d-4f2b-80b5-d32c11e2f963"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.958512 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "b221cfbc-e72d-4f2b-80b5-d32c11e2f963" (UID: "b221cfbc-e72d-4f2b-80b5-d32c11e2f963"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 01:19:32 crc kubenswrapper[4764]: I1204 01:19:32.975962 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b221cfbc-e72d-4f2b-80b5-d32c11e2f963" (UID: "b221cfbc-e72d-4f2b-80b5-d32c11e2f963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.030563 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.030623 4764 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data-merged\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.030644 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.030662 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b221cfbc-e72d-4f2b-80b5-d32c11e2f963-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.326977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-ld9jx" event={"ID":"b221cfbc-e72d-4f2b-80b5-d32c11e2f963","Type":"ContainerDied","Data":"497bb5ef196f543046aa18bdf8ac78a3f6e4679bbe7b29b0d27aa282e1da18bb"}
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.327026 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="497bb5ef196f543046aa18bdf8ac78a3f6e4679bbe7b29b0d27aa282e1da18bb"
Dec 04 01:19:33 crc kubenswrapper[4764]: I1204 01:19:33.327080 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-ld9jx"
Dec 04 01:19:41 crc kubenswrapper[4764]: I1204 01:19:41.476464 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-wqtrg"
Dec 04 01:19:56 crc kubenswrapper[4764]: I1204 01:19:56.087147 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-rsv44"]
Dec 04 01:19:56 crc kubenswrapper[4764]: I1204 01:19:56.087997 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" podUID="5f940d19-5555-4701-8c57-34396792e9cb" containerName="octavia-amphora-httpd" containerID="cri-o://a5fb2555544c9f0a842c11bc4884b591fde087d60592aec23077352ace2d3526" gracePeriod=30
Dec 04 01:19:56 crc kubenswrapper[4764]: I1204 01:19:56.560329 4764 generic.go:334] "Generic (PLEG): container finished" podID="5f940d19-5555-4701-8c57-34396792e9cb" containerID="a5fb2555544c9f0a842c11bc4884b591fde087d60592aec23077352ace2d3526" exitCode=0
Dec 04 01:19:56 crc kubenswrapper[4764]: I1204 01:19:56.560408 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" event={"ID":"5f940d19-5555-4701-8c57-34396792e9cb","Type":"ContainerDied","Data":"a5fb2555544c9f0a842c11bc4884b591fde087d60592aec23077352ace2d3526"}
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.265796 4764 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-rsv44"
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.321527 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5f940d19-5555-4701-8c57-34396792e9cb-amphora-image\") pod \"5f940d19-5555-4701-8c57-34396792e9cb\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") "
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.321833 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f940d19-5555-4701-8c57-34396792e9cb-httpd-config\") pod \"5f940d19-5555-4701-8c57-34396792e9cb\" (UID: \"5f940d19-5555-4701-8c57-34396792e9cb\") "
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.366614 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f940d19-5555-4701-8c57-34396792e9cb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5f940d19-5555-4701-8c57-34396792e9cb" (UID: "5f940d19-5555-4701-8c57-34396792e9cb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.377737 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f940d19-5555-4701-8c57-34396792e9cb-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "5f940d19-5555-4701-8c57-34396792e9cb" (UID: "5f940d19-5555-4701-8c57-34396792e9cb"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.425551 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5f940d19-5555-4701-8c57-34396792e9cb-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.425579 4764 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/5f940d19-5555-4701-8c57-34396792e9cb-amphora-image\") on node \"crc\" DevicePath \"\""
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.572082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-rsv44" event={"ID":"5f940d19-5555-4701-8c57-34396792e9cb","Type":"ContainerDied","Data":"0b1c4b2231d5c33c1b3a899c1d1c2ec66457823402e2b36925b00ad660a17f5c"}
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.572127 4764 scope.go:117] "RemoveContainer" containerID="a5fb2555544c9f0a842c11bc4884b591fde087d60592aec23077352ace2d3526"
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.572383 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-rsv44"
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.611893 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-rsv44"]
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.614546 4764 scope.go:117] "RemoveContainer" containerID="ba12160a89b51d479a52afbdb3a714cd6f2787d3b90e2f0bd0e08e12140d0fa0"
Dec 04 01:19:57 crc kubenswrapper[4764]: I1204 01:19:57.626523 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-rsv44"]
Dec 04 01:19:58 crc kubenswrapper[4764]: I1204 01:19:58.560679 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f940d19-5555-4701-8c57-34396792e9cb" path="/var/lib/kubelet/pods/5f940d19-5555-4701-8c57-34396792e9cb/volumes"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.230261 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-5tszs"]
Dec 04 01:20:01 crc kubenswrapper[4764]: E1204 01:20:01.231278 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f940d19-5555-4701-8c57-34396792e9cb" containerName="octavia-amphora-httpd"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231297 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f940d19-5555-4701-8c57-34396792e9cb" containerName="octavia-amphora-httpd"
Dec 04 01:20:01 crc kubenswrapper[4764]: E1204 01:20:01.231331 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerName="octavia-db-sync"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231340 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerName="octavia-db-sync"
Dec 04 01:20:01 crc kubenswrapper[4764]: E1204 01:20:01.231352 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerName="init"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231359 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerName="init"
Dec 04 01:20:01 crc kubenswrapper[4764]: E1204 01:20:01.231378 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f940d19-5555-4701-8c57-34396792e9cb" containerName="init"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231385 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f940d19-5555-4701-8c57-34396792e9cb" containerName="init"
Dec 04 01:20:01 crc kubenswrapper[4764]: E1204 01:20:01.231396 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67cbce5b-affe-4e6a-a146-fd4f1a30c951" containerName="ovn-config"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231403 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="67cbce5b-affe-4e6a-a146-fd4f1a30c951" containerName="ovn-config"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231610 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f940d19-5555-4701-8c57-34396792e9cb" containerName="octavia-amphora-httpd"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231636 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" containerName="octavia-db-sync"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.231657 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="67cbce5b-affe-4e6a-a146-fd4f1a30c951" containerName="ovn-config"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.232853 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.236333 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.245049 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-5tszs"]
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.344016 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/097194b1-f646-4a43-b58c-e2bfb03da583-httpd-config\") pod \"octavia-image-upload-56c9f55b99-5tszs\" (UID: \"097194b1-f646-4a43-b58c-e2bfb03da583\") " pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.344100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/097194b1-f646-4a43-b58c-e2bfb03da583-amphora-image\") pod \"octavia-image-upload-56c9f55b99-5tszs\" (UID: \"097194b1-f646-4a43-b58c-e2bfb03da583\") " pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.445969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/097194b1-f646-4a43-b58c-e2bfb03da583-httpd-config\") pod \"octavia-image-upload-56c9f55b99-5tszs\" (UID: \"097194b1-f646-4a43-b58c-e2bfb03da583\") " pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.446070 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/097194b1-f646-4a43-b58c-e2bfb03da583-amphora-image\") pod \"octavia-image-upload-56c9f55b99-5tszs\" (UID: \"097194b1-f646-4a43-b58c-e2bfb03da583\") " pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.446796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/097194b1-f646-4a43-b58c-e2bfb03da583-amphora-image\") pod \"octavia-image-upload-56c9f55b99-5tszs\" (UID: \"097194b1-f646-4a43-b58c-e2bfb03da583\") " pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.451512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/097194b1-f646-4a43-b58c-e2bfb03da583-httpd-config\") pod \"octavia-image-upload-56c9f55b99-5tszs\" (UID: \"097194b1-f646-4a43-b58c-e2bfb03da583\") " pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:01 crc kubenswrapper[4764]: I1204 01:20:01.555498 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-56c9f55b99-5tszs"
Dec 04 01:20:02 crc kubenswrapper[4764]: I1204 01:20:02.079341 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-56c9f55b99-5tszs"]
Dec 04 01:20:02 crc kubenswrapper[4764]: I1204 01:20:02.675995 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-5tszs" event={"ID":"097194b1-f646-4a43-b58c-e2bfb03da583","Type":"ContainerStarted","Data":"93fc06c0b9efacfc3d3e7b3526df13d94eae4df5cd2d3ba7c64097a66ea63330"}
Dec 04 01:20:03 crc kubenswrapper[4764]: I1204 01:20:03.687688 4764 generic.go:334] "Generic (PLEG): container finished" podID="097194b1-f646-4a43-b58c-e2bfb03da583" containerID="0f03d1a00909de99d9912d62a3014bcd4b6c1858ce349178c4e6e7a8c6418e26" exitCode=0
Dec 04 01:20:03 crc kubenswrapper[4764]: I1204 01:20:03.687790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-5tszs" event={"ID":"097194b1-f646-4a43-b58c-e2bfb03da583","Type":"ContainerDied","Data":"0f03d1a00909de99d9912d62a3014bcd4b6c1858ce349178c4e6e7a8c6418e26"}
Dec 04 01:20:04 crc kubenswrapper[4764]: I1204 01:20:04.699480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-56c9f55b99-5tszs" event={"ID":"097194b1-f646-4a43-b58c-e2bfb03da583","Type":"ContainerStarted","Data":"3be00051c45aa01ab70f09e9b6dcd62135dba02b786e44be654cc961aa7a5560"}
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.538932 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-56c9f55b99-5tszs" podStartSLOduration=16.054163312 podStartE2EDuration="16.538912004s" podCreationTimestamp="2025-12-04 01:20:01 +0000 UTC" firstStartedPulling="2025-12-04 01:20:02.061910327 +0000 UTC m=+5937.823234738" lastFinishedPulling="2025-12-04 01:20:02.546659009 +0000 UTC m=+5938.307983430" observedRunningTime="2025-12-04 01:20:04.71473524 +0000 UTC m=+5940.476059661" watchObservedRunningTime="2025-12-04 01:20:17.538912004 +0000 UTC m=+5953.300236415"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.550098 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-q9k9p"]
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.552537 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.557472 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.557703 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.557741 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.563797 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-q9k9p"]
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.678412 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-scripts\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.678818 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-combined-ca-bundle\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.679091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-amphora-certs\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.679249 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-config-data-merged\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.679569 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-config-data\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.679618 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-hm-ports\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.781800 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-amphora-certs\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.781910 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-config-data-merged\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.782021 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-config-data\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.782047 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-hm-ports\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.782098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-scripts\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.782141 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-combined-ca-bundle\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.783015 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-config-data-merged\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.783259 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-hm-ports\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.787676 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-amphora-certs\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.787714 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-combined-ca-bundle\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.787959 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-scripts\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.788358 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdd7c41d-eb6c-4fdc-8e37-a31282572e7d-config-data\") pod \"octavia-healthmanager-q9k9p\" (UID: \"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d\") " pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:17 crc kubenswrapper[4764]: I1204 01:20:17.884598 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-healthmanager-q9k9p"
Dec 04 01:20:18 crc kubenswrapper[4764]: I1204 01:20:18.453699 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-q9k9p"]
Dec 04 01:20:18 crc kubenswrapper[4764]: W1204 01:20:18.458377 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdd7c41d_eb6c_4fdc_8e37_a31282572e7d.slice/crio-e1cbd70e27aee16e15ab3bf08545f60660ecce4c33a540a6d1a5e4a3b268c419 WatchSource:0}: Error finding container e1cbd70e27aee16e15ab3bf08545f60660ecce4c33a540a6d1a5e4a3b268c419: Status 404 returned error can't find the container with id e1cbd70e27aee16e15ab3bf08545f60660ecce4c33a540a6d1a5e4a3b268c419
Dec 04 01:20:18 crc kubenswrapper[4764]: I1204 01:20:18.870744 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-q9k9p" event={"ID":"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d","Type":"ContainerStarted","Data":"e1cbd70e27aee16e15ab3bf08545f60660ecce4c33a540a6d1a5e4a3b268c419"}
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.484853 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-hfsz5"]
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.487351 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.489610 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.489947 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.508690 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-hfsz5"]
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.615775 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15380bf0-c6f1-47bd-9fe1-9062df838464-hm-ports\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.615836 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-combined-ca-bundle\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.615991 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-config-data\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.616050 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15380bf0-c6f1-47bd-9fe1-9062df838464-config-data-merged\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.616146 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-scripts\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.616650 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-amphora-certs\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.719086 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15380bf0-c6f1-47bd-9fe1-9062df838464-hm-ports\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.719156 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-combined-ca-bundle\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.719306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-config-data\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.719361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15380bf0-c6f1-47bd-9fe1-9062df838464-config-data-merged\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.719438 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-scripts\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.719505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-amphora-certs\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.720674 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/15380bf0-c6f1-47bd-9fe1-9062df838464-config-data-merged\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.722011 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/15380bf0-c6f1-47bd-9fe1-9062df838464-hm-ports\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.725526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-combined-ca-bundle\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.728026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-scripts\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.728896 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-amphora-certs\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.729837 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15380bf0-c6f1-47bd-9fe1-9062df838464-config-data\") pod \"octavia-housekeeping-hfsz5\" (UID: \"15380bf0-c6f1-47bd-9fe1-9062df838464\") " pod="openstack/octavia-housekeeping-hfsz5"
Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.803459 4764 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/octavia-housekeeping-hfsz5" Dec 04 01:20:19 crc kubenswrapper[4764]: I1204 01:20:19.888774 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-q9k9p" event={"ID":"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d","Type":"ContainerStarted","Data":"666ef5ef9a9af1e16417a937607554f5f6710a692427599024b2d225f67de72e"} Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.402381 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-hfsz5"] Dec 04 01:20:20 crc kubenswrapper[4764]: W1204 01:20:20.408785 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15380bf0_c6f1_47bd_9fe1_9062df838464.slice/crio-4050b910e7be34b389fd77a1fef77d08cb8f1a946e5de380c07a77ec99d786b9 WatchSource:0}: Error finding container 4050b910e7be34b389fd77a1fef77d08cb8f1a946e5de380c07a77ec99d786b9: Status 404 returned error can't find the container with id 4050b910e7be34b389fd77a1fef77d08cb8f1a946e5de380c07a77ec99d786b9 Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.613140 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-cms8z"] Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.616171 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.618192 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.618802 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.623165 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-cms8z"] Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.742526 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-combined-ca-bundle\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.742668 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-config-data\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.742890 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-scripts\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.742938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bc704469-633b-462f-8b15-974b0d822837-hm-ports\") pod 
\"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.742965 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-amphora-certs\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.743008 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bc704469-633b-462f-8b15-974b0d822837-config-data-merged\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.846111 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-scripts\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.846231 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bc704469-633b-462f-8b15-974b0d822837-hm-ports\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.846277 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-amphora-certs\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 
crc kubenswrapper[4764]: I1204 01:20:20.846340 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bc704469-633b-462f-8b15-974b0d822837-config-data-merged\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.846426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-combined-ca-bundle\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.846569 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-config-data\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.847509 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/bc704469-633b-462f-8b15-974b0d822837-hm-ports\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.848085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/bc704469-633b-462f-8b15-974b0d822837-config-data-merged\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.853254 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-combined-ca-bundle\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.853928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-amphora-certs\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.854185 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-config-data\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.855983 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc704469-633b-462f-8b15-974b0d822837-scripts\") pod \"octavia-worker-cms8z\" (UID: \"bc704469-633b-462f-8b15-974b0d822837\") " pod="openstack/octavia-worker-cms8z" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.869405 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.869908 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.915300 4764 generic.go:334] "Generic (PLEG): container finished" podID="fdd7c41d-eb6c-4fdc-8e37-a31282572e7d" containerID="666ef5ef9a9af1e16417a937607554f5f6710a692427599024b2d225f67de72e" exitCode=0 Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.915361 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-q9k9p" event={"ID":"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d","Type":"ContainerDied","Data":"666ef5ef9a9af1e16417a937607554f5f6710a692427599024b2d225f67de72e"} Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.917152 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-hfsz5" event={"ID":"15380bf0-c6f1-47bd-9fe1-9062df838464","Type":"ContainerStarted","Data":"4050b910e7be34b389fd77a1fef77d08cb8f1a946e5de380c07a77ec99d786b9"} Dec 04 01:20:20 crc kubenswrapper[4764]: I1204 01:20:20.948561 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-cms8z" Dec 04 01:20:21 crc kubenswrapper[4764]: I1204 01:20:21.512488 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-cms8z"] Dec 04 01:20:21 crc kubenswrapper[4764]: I1204 01:20:21.937053 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-q9k9p" event={"ID":"fdd7c41d-eb6c-4fdc-8e37-a31282572e7d","Type":"ContainerStarted","Data":"01f6cc868210d58e57ae4bded062b72744903232537cf9fb3c852bd83434a26c"} Dec 04 01:20:21 crc kubenswrapper[4764]: I1204 01:20:21.937353 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-q9k9p" Dec 04 01:20:21 crc kubenswrapper[4764]: I1204 01:20:21.939622 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-cms8z" event={"ID":"bc704469-633b-462f-8b15-974b0d822837","Type":"ContainerStarted","Data":"4a57b79325d74e141405e29f72e38dd4c62c7c65a560d8183b915e26e41c228a"} Dec 04 01:20:21 crc kubenswrapper[4764]: I1204 01:20:21.958561 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-q9k9p" podStartSLOduration=4.958541899 podStartE2EDuration="4.958541899s" podCreationTimestamp="2025-12-04 01:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:20:21.956240153 +0000 UTC m=+5957.717564584" watchObservedRunningTime="2025-12-04 01:20:21.958541899 +0000 UTC m=+5957.719866320" Dec 04 01:20:22 crc kubenswrapper[4764]: I1204 01:20:22.950524 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-hfsz5" event={"ID":"15380bf0-c6f1-47bd-9fe1-9062df838464","Type":"ContainerStarted","Data":"62a728496a067b6e2844bec45d54bb53bcda37f314224c883e6c30458430a31c"} Dec 04 01:20:23 crc kubenswrapper[4764]: I1204 01:20:23.963328 4764 generic.go:334] "Generic (PLEG): 
container finished" podID="15380bf0-c6f1-47bd-9fe1-9062df838464" containerID="62a728496a067b6e2844bec45d54bb53bcda37f314224c883e6c30458430a31c" exitCode=0 Dec 04 01:20:23 crc kubenswrapper[4764]: I1204 01:20:23.963419 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-hfsz5" event={"ID":"15380bf0-c6f1-47bd-9fe1-9062df838464","Type":"ContainerDied","Data":"62a728496a067b6e2844bec45d54bb53bcda37f314224c883e6c30458430a31c"} Dec 04 01:20:23 crc kubenswrapper[4764]: I1204 01:20:23.965612 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-cms8z" event={"ID":"bc704469-633b-462f-8b15-974b0d822837","Type":"ContainerStarted","Data":"e8ce50f0c99967afddd0c35dee62b81106cf94c72a5c37ebbb442b558697d3e4"} Dec 04 01:20:24 crc kubenswrapper[4764]: I1204 01:20:24.978907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-hfsz5" event={"ID":"15380bf0-c6f1-47bd-9fe1-9062df838464","Type":"ContainerStarted","Data":"fa82af01da38bbe7893cb11a92e5ecbb935b60d4a7718b4d6d3d3f64dacbd82e"} Dec 04 01:20:24 crc kubenswrapper[4764]: I1204 01:20:24.979617 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-hfsz5" Dec 04 01:20:24 crc kubenswrapper[4764]: I1204 01:20:24.981036 4764 generic.go:334] "Generic (PLEG): container finished" podID="bc704469-633b-462f-8b15-974b0d822837" containerID="e8ce50f0c99967afddd0c35dee62b81106cf94c72a5c37ebbb442b558697d3e4" exitCode=0 Dec 04 01:20:24 crc kubenswrapper[4764]: I1204 01:20:24.981075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-cms8z" event={"ID":"bc704469-633b-462f-8b15-974b0d822837","Type":"ContainerDied","Data":"e8ce50f0c99967afddd0c35dee62b81106cf94c72a5c37ebbb442b558697d3e4"} Dec 04 01:20:25 crc kubenswrapper[4764]: I1204 01:20:25.017816 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-hfsz5" 
podStartSLOduration=4.231806502 podStartE2EDuration="6.017787577s" podCreationTimestamp="2025-12-04 01:20:19 +0000 UTC" firstStartedPulling="2025-12-04 01:20:20.411761723 +0000 UTC m=+5956.173086134" lastFinishedPulling="2025-12-04 01:20:22.197742798 +0000 UTC m=+5957.959067209" observedRunningTime="2025-12-04 01:20:25.0093908 +0000 UTC m=+5960.770715211" watchObservedRunningTime="2025-12-04 01:20:25.017787577 +0000 UTC m=+5960.779112028" Dec 04 01:20:25 crc kubenswrapper[4764]: I1204 01:20:25.990847 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-cms8z" event={"ID":"bc704469-633b-462f-8b15-974b0d822837","Type":"ContainerStarted","Data":"914620390c654b8affdc5bf50c8115d832bbf7a261d714a51da1d9a81b97345a"} Dec 04 01:20:26 crc kubenswrapper[4764]: I1204 01:20:26.014521 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-cms8z" podStartSLOduration=4.186580296 podStartE2EDuration="6.014501573s" podCreationTimestamp="2025-12-04 01:20:20 +0000 UTC" firstStartedPulling="2025-12-04 01:20:21.521118772 +0000 UTC m=+5957.282443183" lastFinishedPulling="2025-12-04 01:20:23.349040039 +0000 UTC m=+5959.110364460" observedRunningTime="2025-12-04 01:20:26.006591278 +0000 UTC m=+5961.767915689" watchObservedRunningTime="2025-12-04 01:20:26.014501573 +0000 UTC m=+5961.775825984" Dec 04 01:20:26 crc kubenswrapper[4764]: I1204 01:20:26.999255 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-cms8z" Dec 04 01:20:32 crc kubenswrapper[4764]: I1204 01:20:32.921956 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-q9k9p" Dec 04 01:20:34 crc kubenswrapper[4764]: I1204 01:20:34.846534 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-hfsz5" Dec 04 01:20:35 crc kubenswrapper[4764]: I1204 01:20:35.982027 4764 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-cms8z" Dec 04 01:20:50 crc kubenswrapper[4764]: I1204 01:20:50.869587 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:20:50 crc kubenswrapper[4764]: I1204 01:20:50.870351 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:21:11 crc kubenswrapper[4764]: I1204 01:21:11.062237 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ae2c-account-create-update-8zmdj"] Dec 04 01:21:11 crc kubenswrapper[4764]: I1204 01:21:11.076971 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f5f65"] Dec 04 01:21:11 crc kubenswrapper[4764]: I1204 01:21:11.088978 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ae2c-account-create-update-8zmdj"] Dec 04 01:21:11 crc kubenswrapper[4764]: I1204 01:21:11.099328 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f5f65"] Dec 04 01:21:12 crc kubenswrapper[4764]: I1204 01:21:12.563820 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56aaaa63-fc9e-4be4-b59e-94a58c3d871b" path="/var/lib/kubelet/pods/56aaaa63-fc9e-4be4-b59e-94a58c3d871b/volumes" Dec 04 01:21:12 crc kubenswrapper[4764]: I1204 01:21:12.565262 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700c9327-890d-4ea4-bb92-c9542c0de314" path="/var/lib/kubelet/pods/700c9327-890d-4ea4-bb92-c9542c0de314/volumes" 
Dec 04 01:21:12 crc kubenswrapper[4764]: I1204 01:21:12.636838 4764 scope.go:117] "RemoveContainer" containerID="3ca723cd8246c66c3f9f13e12aa5a610c17ada4d61700922ff2edb7f2e03c16b" Dec 04 01:21:12 crc kubenswrapper[4764]: I1204 01:21:12.666943 4764 scope.go:117] "RemoveContainer" containerID="edd93145d1bedd3c82c160d5efebed40d403e24ca31b5af3b034c6c490a32b42" Dec 04 01:21:12 crc kubenswrapper[4764]: I1204 01:21:12.741800 4764 scope.go:117] "RemoveContainer" containerID="d55e11f8bf69f614d485584fc82152577855c4a5c9e99880a53a8e1e7b21b863" Dec 04 01:21:12 crc kubenswrapper[4764]: I1204 01:21:12.766001 4764 scope.go:117] "RemoveContainer" containerID="d78fcf491b719445bce8b67ea2ccd011109a985b5ce6b037d8c75075bfd0aff6" Dec 04 01:21:17 crc kubenswrapper[4764]: I1204 01:21:17.043969 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-5gq6j"] Dec 04 01:21:17 crc kubenswrapper[4764]: I1204 01:21:17.062663 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-5gq6j"] Dec 04 01:21:18 crc kubenswrapper[4764]: I1204 01:21:18.559458 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44374b57-55ca-4eb6-acd9-34eea12e4f86" path="/var/lib/kubelet/pods/44374b57-55ca-4eb6-acd9-34eea12e4f86/volumes" Dec 04 01:21:20 crc kubenswrapper[4764]: I1204 01:21:20.869071 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:21:20 crc kubenswrapper[4764]: I1204 01:21:20.869794 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 04 01:21:20 crc kubenswrapper[4764]: I1204 01:21:20.869856 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:21:20 crc kubenswrapper[4764]: I1204 01:21:20.870544 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:21:20 crc kubenswrapper[4764]: I1204 01:21:20.870625 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" gracePeriod=600 Dec 04 01:21:21 crc kubenswrapper[4764]: E1204 01:21:21.013094 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:21:21 crc kubenswrapper[4764]: I1204 01:21:21.701828 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" exitCode=0 Dec 04 01:21:21 crc kubenswrapper[4764]: I1204 01:21:21.701993 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3"} Dec 04 01:21:21 crc kubenswrapper[4764]: I1204 01:21:21.702302 4764 scope.go:117] "RemoveContainer" containerID="0afac051ad12f65e3908eb7c494c1d5ea59de94feec2898e7b80d8b1b5968274" Dec 04 01:21:21 crc kubenswrapper[4764]: I1204 01:21:21.703482 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:21:21 crc kubenswrapper[4764]: E1204 01:21:21.704201 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.721857 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74bff6c665-d8pvt"] Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.725480 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.728673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.728981 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-b5xr7" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.729147 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.732828 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.741431 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74bff6c665-d8pvt"] Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.798333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-scripts\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.798441 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7586e32c-be69-45bb-a636-89f5a6d55502-logs\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.798555 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7586e32c-be69-45bb-a636-89f5a6d55502-horizon-secret-key\") pod \"horizon-74bff6c665-d8pvt\" (UID: 
\"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.798620 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj7b5\" (UniqueName: \"kubernetes.io/projected/7586e32c-be69-45bb-a636-89f5a6d55502-kube-api-access-jj7b5\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.798688 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-config-data\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.817789 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.818063 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-log" containerID="cri-o://02caf2897aeae1c89816e55fc6b0cf6dc8791167a7e894adf8747732e4815bec" gracePeriod=30 Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.818499 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-httpd" containerID="cri-o://084829b6a456cb2942d5b33745e4a36ecc94629b56ddfd24cbd5605ca415e25f" gracePeriod=30 Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.841567 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6854fc5bc7-sj99l"] Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.844609 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.848416 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6854fc5bc7-sj99l"] Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.865409 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.865900 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-httpd" containerID="cri-o://aa91d592b342f511954169a3a8530652656813d36b12423829a4c4066288ce44" gracePeriod=30 Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.866078 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-log" containerID="cri-o://65b03840f2954627f6af8c2beb5d803307b845e7120689e81e966028c4620286" gracePeriod=30 Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.900236 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-scripts\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.900360 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7586e32c-be69-45bb-a636-89f5a6d55502-logs\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.900458 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7586e32c-be69-45bb-a636-89f5a6d55502-horizon-secret-key\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.900518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj7b5\" (UniqueName: \"kubernetes.io/projected/7586e32c-be69-45bb-a636-89f5a6d55502-kube-api-access-jj7b5\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.900583 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-config-data\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.901242 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7586e32c-be69-45bb-a636-89f5a6d55502-logs\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.902092 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-scripts\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.902133 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-config-data\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.910824 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7586e32c-be69-45bb-a636-89f5a6d55502-horizon-secret-key\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:26 crc kubenswrapper[4764]: I1204 01:21:26.918793 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj7b5\" (UniqueName: \"kubernetes.io/projected/7586e32c-be69-45bb-a636-89f5a6d55502-kube-api-access-jj7b5\") pod \"horizon-74bff6c665-d8pvt\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.002274 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/247b2aef-77fe-4708-92b4-11255e8f2c01-horizon-secret-key\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.002332 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-config-data\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.002636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-scripts\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.002875 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/247b2aef-77fe-4708-92b4-11255e8f2c01-logs\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.002935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5nfp\" (UniqueName: \"kubernetes.io/projected/247b2aef-77fe-4708-92b4-11255e8f2c01-kube-api-access-r5nfp\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.047418 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.105311 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-scripts\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.105689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/247b2aef-77fe-4708-92b4-11255e8f2c01-logs\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.105736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5nfp\" (UniqueName: \"kubernetes.io/projected/247b2aef-77fe-4708-92b4-11255e8f2c01-kube-api-access-r5nfp\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.105811 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/247b2aef-77fe-4708-92b4-11255e8f2c01-horizon-secret-key\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.106235 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-scripts\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.106762 
4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-config-data\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.108655 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-config-data\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.108796 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/247b2aef-77fe-4708-92b4-11255e8f2c01-logs\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.113857 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/247b2aef-77fe-4708-92b4-11255e8f2c01-horizon-secret-key\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.135696 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5nfp\" (UniqueName: \"kubernetes.io/projected/247b2aef-77fe-4708-92b4-11255e8f2c01-kube-api-access-r5nfp\") pod \"horizon-6854fc5bc7-sj99l\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.171574 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.349241 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6854fc5bc7-sj99l"] Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.384504 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c6459d4df-dh2rf"] Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.389390 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.395246 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c6459d4df-dh2rf"] Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.518115 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-horizon-secret-key\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.518165 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-logs\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.518224 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrsc\" (UniqueName: \"kubernetes.io/projected/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-kube-api-access-xbrsc\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.518272 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-config-data\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.518319 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-scripts\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.623812 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-horizon-secret-key\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.623868 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-logs\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.623916 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrsc\" (UniqueName: \"kubernetes.io/projected/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-kube-api-access-xbrsc\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.623970 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-config-data\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.624026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-scripts\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.625206 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-scripts\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.626329 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-logs\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.628743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-config-data\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.637445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-horizon-secret-key\") pod \"horizon-6c6459d4df-dh2rf\" (UID: 
\"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.650445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrsc\" (UniqueName: \"kubernetes.io/projected/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-kube-api-access-xbrsc\") pod \"horizon-6c6459d4df-dh2rf\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.720032 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.732324 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6854fc5bc7-sj99l"] Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.758548 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.765507 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74bff6c665-d8pvt"] Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.787356 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6854fc5bc7-sj99l" event={"ID":"247b2aef-77fe-4708-92b4-11255e8f2c01","Type":"ContainerStarted","Data":"0a684e37416307cfcd35155255e12895dde212128068bac09f58969a35e8ebd0"} Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.789608 4764 generic.go:334] "Generic (PLEG): container finished" podID="e42df602-041a-4cce-9576-fa19d5cb9750" containerID="02caf2897aeae1c89816e55fc6b0cf6dc8791167a7e894adf8747732e4815bec" exitCode=143 Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.789656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e42df602-041a-4cce-9576-fa19d5cb9750","Type":"ContainerDied","Data":"02caf2897aeae1c89816e55fc6b0cf6dc8791167a7e894adf8747732e4815bec"} Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.791149 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bff6c665-d8pvt" event={"ID":"7586e32c-be69-45bb-a636-89f5a6d55502","Type":"ContainerStarted","Data":"470593dec27f69478a6bab19dc6ccb78970a22066e027966a1020bdd7d41cd06"} Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.793014 4764 generic.go:334] "Generic (PLEG): container finished" podID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerID="65b03840f2954627f6af8c2beb5d803307b845e7120689e81e966028c4620286" exitCode=143 Dec 04 01:21:27 crc kubenswrapper[4764]: I1204 01:21:27.793038 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5509e2e8-e0a9-48cd-9afa-17dfa3b38369","Type":"ContainerDied","Data":"65b03840f2954627f6af8c2beb5d803307b845e7120689e81e966028c4620286"} Dec 04 01:21:28 crc kubenswrapper[4764]: I1204 01:21:28.202992 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c6459d4df-dh2rf"] Dec 04 01:21:28 crc kubenswrapper[4764]: I1204 01:21:28.804766 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6459d4df-dh2rf" event={"ID":"e3c7ff3a-e458-4acd-a532-d7cfe9232e03","Type":"ContainerStarted","Data":"25a06564c9e8fffe545625166e0321b47f929d2c63b760414e3b2cf90293a5ff"} Dec 04 01:21:30 crc kubenswrapper[4764]: I1204 01:21:30.830636 4764 generic.go:334] "Generic (PLEG): container finished" podID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerID="aa91d592b342f511954169a3a8530652656813d36b12423829a4c4066288ce44" exitCode=0 Dec 04 01:21:30 crc kubenswrapper[4764]: I1204 01:21:30.830748 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5509e2e8-e0a9-48cd-9afa-17dfa3b38369","Type":"ContainerDied","Data":"aa91d592b342f511954169a3a8530652656813d36b12423829a4c4066288ce44"} Dec 04 01:21:30 crc kubenswrapper[4764]: I1204 01:21:30.833601 4764 generic.go:334] "Generic (PLEG): container finished" podID="e42df602-041a-4cce-9576-fa19d5cb9750" containerID="084829b6a456cb2942d5b33745e4a36ecc94629b56ddfd24cbd5605ca415e25f" exitCode=0 Dec 04 01:21:30 crc kubenswrapper[4764]: I1204 01:21:30.833644 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e42df602-041a-4cce-9576-fa19d5cb9750","Type":"ContainerDied","Data":"084829b6a456cb2942d5b33745e4a36ecc94629b56ddfd24cbd5605ca415e25f"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.028182 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.055162 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-ceph\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130752 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-httpd-run\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130776 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-combined-ca-bundle\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130816 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrf9q\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-kube-api-access-lrf9q\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130845 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-config-data\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-combined-ca-bundle\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130894 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-logs\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130933 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-config-data\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.130979 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-ceph\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.131012 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z98w\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-kube-api-access-7z98w\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.131032 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-httpd-run\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.131071 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-scripts\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.131118 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-logs\") pod \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\" (UID: \"5509e2e8-e0a9-48cd-9afa-17dfa3b38369\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.131157 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-scripts\") pod \"e42df602-041a-4cce-9576-fa19d5cb9750\" (UID: \"e42df602-041a-4cce-9576-fa19d5cb9750\") " Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.134646 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-logs" (OuterVolumeSpecName: "logs") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.137122 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-scripts" (OuterVolumeSpecName: "scripts") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.140208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.140329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-kube-api-access-lrf9q" (OuterVolumeSpecName: "kube-api-access-lrf9q") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "kube-api-access-lrf9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.140588 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.141556 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-logs" (OuterVolumeSpecName: "logs") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.144779 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-kube-api-access-7z98w" (OuterVolumeSpecName: "kube-api-access-7z98w") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "kube-api-access-7z98w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.145430 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-scripts" (OuterVolumeSpecName: "scripts") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.146688 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-ceph" (OuterVolumeSpecName: "ceph") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.148014 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-ceph" (OuterVolumeSpecName: "ceph") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.186119 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.206085 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.240864 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-config-data" (OuterVolumeSpecName: "config-data") pod "5509e2e8-e0a9-48cd-9afa-17dfa3b38369" (UID: "5509e2e8-e0a9-48cd-9afa-17dfa3b38369"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250474 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250515 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250531 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z98w\" (UniqueName: \"kubernetes.io/projected/e42df602-041a-4cce-9576-fa19d5cb9750-kube-api-access-7z98w\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250548 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250564 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250591 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250604 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250616 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250627 4764 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250638 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250650 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrf9q\" (UniqueName: \"kubernetes.io/projected/5509e2e8-e0a9-48cd-9afa-17dfa3b38369-kube-api-access-lrf9q\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250663 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.250673 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e42df602-041a-4cce-9576-fa19d5cb9750-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.252499 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-config-data" (OuterVolumeSpecName: "config-data") pod "e42df602-041a-4cce-9576-fa19d5cb9750" (UID: "e42df602-041a-4cce-9576-fa19d5cb9750"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.353469 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42df602-041a-4cce-9576-fa19d5cb9750-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.886643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6854fc5bc7-sj99l" event={"ID":"247b2aef-77fe-4708-92b4-11255e8f2c01","Type":"ContainerStarted","Data":"a66aac1cca762730569a1974f2c207b4d6a6d993651e0c476085bb48fcf43e8c"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.887005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6854fc5bc7-sj99l" event={"ID":"247b2aef-77fe-4708-92b4-11255e8f2c01","Type":"ContainerStarted","Data":"dc5810ef8ab98b3a7cfbd3a5acc09babca231c3f501293fe54837901e49b8e4d"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.886815 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6854fc5bc7-sj99l" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon" containerID="cri-o://a66aac1cca762730569a1974f2c207b4d6a6d993651e0c476085bb48fcf43e8c" gracePeriod=30 Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.886785 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6854fc5bc7-sj99l" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon-log" containerID="cri-o://dc5810ef8ab98b3a7cfbd3a5acc09babca231c3f501293fe54837901e49b8e4d" gracePeriod=30 Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.889779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6459d4df-dh2rf" event={"ID":"e3c7ff3a-e458-4acd-a532-d7cfe9232e03","Type":"ContainerStarted","Data":"612cd3f07a344a3ecee7a42c0583f40ad79557e38a3b1edd84984fd07aba56dc"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 
01:21:35.889837 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6459d4df-dh2rf" event={"ID":"e3c7ff3a-e458-4acd-a532-d7cfe9232e03","Type":"ContainerStarted","Data":"1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.892175 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e42df602-041a-4cce-9576-fa19d5cb9750","Type":"ContainerDied","Data":"80a395fec67ba56e529f505f520f0208ede1113a1bc48e4a842ff8967274ccec"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.892220 4764 scope.go:117] "RemoveContainer" containerID="084829b6a456cb2942d5b33745e4a36ecc94629b56ddfd24cbd5605ca415e25f" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.892390 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.900028 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bff6c665-d8pvt" event={"ID":"7586e32c-be69-45bb-a636-89f5a6d55502","Type":"ContainerStarted","Data":"cde867efa9ed1ef62e54ce1b45c11b1742c5a9d91626b5e7f853ee44e29d6bf7"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.900066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bff6c665-d8pvt" event={"ID":"7586e32c-be69-45bb-a636-89f5a6d55502","Type":"ContainerStarted","Data":"98673d74fc079017d323e98e905ee9976be556fd095905e6bb3f96d3b4b6f936"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.909816 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5509e2e8-e0a9-48cd-9afa-17dfa3b38369","Type":"ContainerDied","Data":"b3c2f52117edee2edf365741258497553bd92386653c19e29301c85db79d4bfc"} Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.909919 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.921551 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6854fc5bc7-sj99l" podStartSLOduration=2.64066317 podStartE2EDuration="9.921530888s" podCreationTimestamp="2025-12-04 01:21:26 +0000 UTC" firstStartedPulling="2025-12-04 01:21:27.758259709 +0000 UTC m=+6023.519584120" lastFinishedPulling="2025-12-04 01:21:35.039127417 +0000 UTC m=+6030.800451838" observedRunningTime="2025-12-04 01:21:35.918073623 +0000 UTC m=+6031.679398044" watchObservedRunningTime="2025-12-04 01:21:35.921530888 +0000 UTC m=+6031.682855289" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.942839 4764 scope.go:117] "RemoveContainer" containerID="02caf2897aeae1c89816e55fc6b0cf6dc8791167a7e894adf8747732e4815bec" Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.958963 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:21:35 crc kubenswrapper[4764]: I1204 01:21:35.977246 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.011372 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: E1204 01:21:36.011934 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-log" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.011956 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-log" Dec 04 01:21:36 crc kubenswrapper[4764]: E1204 01:21:36.012007 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-log" Dec 04 01:21:36 crc kubenswrapper[4764]: 
I1204 01:21:36.012017 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-log" Dec 04 01:21:36 crc kubenswrapper[4764]: E1204 01:21:36.012048 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-httpd" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.012060 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-httpd" Dec 04 01:21:36 crc kubenswrapper[4764]: E1204 01:21:36.012081 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-httpd" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.012090 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-httpd" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.012327 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-log" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.012355 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" containerName="glance-httpd" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.012370 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-log" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.012388 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" containerName="glance-httpd" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.013831 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.020609 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.020737 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.020807 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-4bq64" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.047800 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-74bff6c665-d8pvt" podStartSLOduration=2.741900471 podStartE2EDuration="10.047772525s" podCreationTimestamp="2025-12-04 01:21:26 +0000 UTC" firstStartedPulling="2025-12-04 01:21:27.773054233 +0000 UTC m=+6023.534378644" lastFinishedPulling="2025-12-04 01:21:35.078926277 +0000 UTC m=+6030.840250698" observedRunningTime="2025-12-04 01:21:35.985532803 +0000 UTC m=+6031.746857214" watchObservedRunningTime="2025-12-04 01:21:36.047772525 +0000 UTC m=+6031.809096946" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.061734 4764 scope.go:117] "RemoveContainer" containerID="aa91d592b342f511954169a3a8530652656813d36b12423829a4c4066288ce44" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.070935 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.071862 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 
01:21:36.071935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f231d621-5c72-4559-aad3-392c8bcba6e1-logs\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.071955 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f231d621-5c72-4559-aad3-392c8bcba6e1-ceph\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.071996 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mc7l\" (UniqueName: \"kubernetes.io/projected/f231d621-5c72-4559-aad3-392c8bcba6e1-kube-api-access-2mc7l\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.072063 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.072103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f231d621-5c72-4559-aad3-392c8bcba6e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 
01:21:36.072128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.085461 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c6459d4df-dh2rf" podStartSLOduration=2.262182489 podStartE2EDuration="9.085438643s" podCreationTimestamp="2025-12-04 01:21:27 +0000 UTC" firstStartedPulling="2025-12-04 01:21:28.19218057 +0000 UTC m=+6023.953505021" lastFinishedPulling="2025-12-04 01:21:35.015436764 +0000 UTC m=+6030.776761175" observedRunningTime="2025-12-04 01:21:36.009632187 +0000 UTC m=+6031.770956608" watchObservedRunningTime="2025-12-04 01:21:36.085438643 +0000 UTC m=+6031.846763054" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.100098 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.110502 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.122058 4764 scope.go:117] "RemoveContainer" containerID="65b03840f2954627f6af8c2beb5d803307b845e7120689e81e966028c4620286" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.127911 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.132099 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.134577 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.161939 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173253 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad61f840-5767-444f-9fe8-12d36e3d1582-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173605 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f231d621-5c72-4559-aad3-392c8bcba6e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc 
kubenswrapper[4764]: I1204 01:21:36.173628 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/ad61f840-5767-444f-9fe8-12d36e3d1582-kube-api-access-vz9r7\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173657 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f231d621-5c72-4559-aad3-392c8bcba6e1-logs\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173925 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ad61f840-5767-444f-9fe8-12d36e3d1582-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 
01:21:36.173941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f231d621-5c72-4559-aad3-392c8bcba6e1-ceph\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.173994 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad61f840-5767-444f-9fe8-12d36e3d1582-logs\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.174043 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mc7l\" (UniqueName: \"kubernetes.io/projected/f231d621-5c72-4559-aad3-392c8bcba6e1-kube-api-access-2mc7l\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.174090 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.174464 
4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f231d621-5c72-4559-aad3-392c8bcba6e1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.177644 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f231d621-5c72-4559-aad3-392c8bcba6e1-logs\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.180493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-config-data\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.182999 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.190025 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f231d621-5c72-4559-aad3-392c8bcba6e1-ceph\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.197962 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f231d621-5c72-4559-aad3-392c8bcba6e1-scripts\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.203425 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mc7l\" (UniqueName: \"kubernetes.io/projected/f231d621-5c72-4559-aad3-392c8bcba6e1-kube-api-access-2mc7l\") pod \"glance-default-external-api-0\" (UID: \"f231d621-5c72-4559-aad3-392c8bcba6e1\") " pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.275524 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ad61f840-5767-444f-9fe8-12d36e3d1582-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276036 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276080 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad61f840-5767-444f-9fe8-12d36e3d1582-logs\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276175 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276235 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad61f840-5767-444f-9fe8-12d36e3d1582-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276313 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/ad61f840-5767-444f-9fe8-12d36e3d1582-kube-api-access-vz9r7\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.276910 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ad61f840-5767-444f-9fe8-12d36e3d1582-logs\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.280679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ad61f840-5767-444f-9fe8-12d36e3d1582-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " 
pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.281995 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/ad61f840-5767-444f-9fe8-12d36e3d1582-ceph\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.282138 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.284087 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.287898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad61f840-5767-444f-9fe8-12d36e3d1582-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.350679 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/ad61f840-5767-444f-9fe8-12d36e3d1582-kube-api-access-vz9r7\") pod \"glance-default-internal-api-0\" (UID: \"ad61f840-5767-444f-9fe8-12d36e3d1582\") " pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.365593 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.453448 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.554820 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:21:36 crc kubenswrapper[4764]: E1204 01:21:36.555452 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.566592 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5509e2e8-e0a9-48cd-9afa-17dfa3b38369" path="/var/lib/kubelet/pods/5509e2e8-e0a9-48cd-9afa-17dfa3b38369/volumes" Dec 04 01:21:36 crc kubenswrapper[4764]: I1204 01:21:36.567898 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42df602-041a-4cce-9576-fa19d5cb9750" path="/var/lib/kubelet/pods/e42df602-041a-4cce-9576-fa19d5cb9750/volumes" Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.047755 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.048107 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.172361 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.205682 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.721382 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.721426 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:21:37 crc kubenswrapper[4764]: I1204 01:21:37.935384 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ad61f840-5767-444f-9fe8-12d36e3d1582","Type":"ContainerStarted","Data":"7b9019605036a8239a444e81e51af6903482b699e67d4837ae61d92e922a00f1"} Dec 04 01:21:38 crc kubenswrapper[4764]: I1204 01:21:38.159483 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 01:21:38 crc kubenswrapper[4764]: W1204 01:21:38.167072 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf231d621_5c72_4559_aad3_392c8bcba6e1.slice/crio-998380c1fb625950a6e7274b7563642a396cbf7026b153b9a9bb462039d5dfd5 WatchSource:0}: Error finding container 998380c1fb625950a6e7274b7563642a396cbf7026b153b9a9bb462039d5dfd5: Status 404 returned error can't find the container with id 998380c1fb625950a6e7274b7563642a396cbf7026b153b9a9bb462039d5dfd5 Dec 04 01:21:38 crc kubenswrapper[4764]: I1204 01:21:38.965984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ad61f840-5767-444f-9fe8-12d36e3d1582","Type":"ContainerStarted","Data":"a335f0199de030b371bf868e97fc9837b14210be18936dc7a78d610ec111ba00"} Dec 04 01:21:38 crc kubenswrapper[4764]: I1204 01:21:38.966700 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ad61f840-5767-444f-9fe8-12d36e3d1582","Type":"ContainerStarted","Data":"3a319a7806e8dc8fb1dd5113694d7168dfd466eb21d17def9350b2ab49e8e44f"} Dec 04 01:21:38 crc kubenswrapper[4764]: I1204 01:21:38.978566 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f231d621-5c72-4559-aad3-392c8bcba6e1","Type":"ContainerStarted","Data":"d7f054447c13d50db78ebca23b6e61e383e1cfb9f4c6427ee0905cc31253b179"} Dec 04 01:21:38 crc kubenswrapper[4764]: I1204 01:21:38.978610 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f231d621-5c72-4559-aad3-392c8bcba6e1","Type":"ContainerStarted","Data":"998380c1fb625950a6e7274b7563642a396cbf7026b153b9a9bb462039d5dfd5"} Dec 04 01:21:40 crc kubenswrapper[4764]: I1204 01:21:40.019977 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f231d621-5c72-4559-aad3-392c8bcba6e1","Type":"ContainerStarted","Data":"ba9caecefae0116bc6116b22dff27eee6c9aedb22f7e91da6e0bd8558a89cf90"} Dec 04 01:21:40 crc kubenswrapper[4764]: I1204 01:21:40.040280 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.040255106 podStartE2EDuration="4.040255106s" podCreationTimestamp="2025-12-04 01:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:21:38.999306892 +0000 UTC m=+6034.760631303" watchObservedRunningTime="2025-12-04 01:21:40.040255106 +0000 UTC m=+6035.801579537" Dec 04 01:21:45 crc kubenswrapper[4764]: I1204 01:21:45.094955 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.094929244 podStartE2EDuration="10.094929244s" 
podCreationTimestamp="2025-12-04 01:21:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:21:40.041433535 +0000 UTC m=+6035.802757946" watchObservedRunningTime="2025-12-04 01:21:45.094929244 +0000 UTC m=+6040.856253675" Dec 04 01:21:45 crc kubenswrapper[4764]: I1204 01:21:45.108840 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5469-account-create-update-9w7d2"] Dec 04 01:21:45 crc kubenswrapper[4764]: I1204 01:21:45.120811 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-t6zlw"] Dec 04 01:21:45 crc kubenswrapper[4764]: I1204 01:21:45.132680 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5469-account-create-update-9w7d2"] Dec 04 01:21:45 crc kubenswrapper[4764]: I1204 01:21:45.146942 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-t6zlw"] Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.366742 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.367073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.420826 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.429724 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.454252 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.454288 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.502946 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.507302 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.561278 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="530b9e0b-90f2-4b27-9109-d59dcdfd8c71" path="/var/lib/kubelet/pods/530b9e0b-90f2-4b27-9109-d59dcdfd8c71/volumes" Dec 04 01:21:46 crc kubenswrapper[4764]: I1204 01:21:46.561905 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8422e89a-26b9-4543-b999-720bf0cf7224" path="/var/lib/kubelet/pods/8422e89a-26b9-4543-b999-720bf0cf7224/volumes" Dec 04 01:21:47 crc kubenswrapper[4764]: I1204 01:21:47.049447 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-74bff6c665-d8pvt" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 04 01:21:47 crc kubenswrapper[4764]: I1204 01:21:47.105671 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 01:21:47 crc kubenswrapper[4764]: I1204 01:21:47.105786 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 01:21:47 crc kubenswrapper[4764]: I1204 01:21:47.105817 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:47 crc kubenswrapper[4764]: I1204 01:21:47.107094 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:47 crc kubenswrapper[4764]: I1204 01:21:47.723119 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c6459d4df-dh2rf" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.124581 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.126392 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.124622 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.126470 4764 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.335360 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.336376 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.411306 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 01:21:49 crc kubenswrapper[4764]: I1204 01:21:49.494235 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 01:21:50 crc kubenswrapper[4764]: I1204 01:21:50.545928 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:21:50 crc kubenswrapper[4764]: E1204 
01:21:50.546462 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:21:54 crc kubenswrapper[4764]: I1204 01:21:54.033262 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fvv6g"] Dec 04 01:21:54 crc kubenswrapper[4764]: I1204 01:21:54.042758 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fvv6g"] Dec 04 01:21:54 crc kubenswrapper[4764]: I1204 01:21:54.558965 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f519b7de-08ef-488a-a725-f7a79ec23e1f" path="/var/lib/kubelet/pods/f519b7de-08ef-488a-a725-f7a79ec23e1f/volumes" Dec 04 01:21:58 crc kubenswrapper[4764]: I1204 01:21:58.729529 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:21:59 crc kubenswrapper[4764]: I1204 01:21:59.417154 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:22:00 crc kubenswrapper[4764]: I1204 01:22:00.388239 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:22:01 crc kubenswrapper[4764]: I1204 01:22:01.049850 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:22:01 crc kubenswrapper[4764]: I1204 01:22:01.137785 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74bff6c665-d8pvt"] Dec 04 01:22:01 crc kubenswrapper[4764]: I1204 01:22:01.272026 4764 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/horizon-74bff6c665-d8pvt" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" containerID="cri-o://cde867efa9ed1ef62e54ce1b45c11b1742c5a9d91626b5e7f853ee44e29d6bf7" gracePeriod=30 Dec 04 01:22:01 crc kubenswrapper[4764]: I1204 01:22:01.272204 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74bff6c665-d8pvt" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon-log" containerID="cri-o://98673d74fc079017d323e98e905ee9976be556fd095905e6bb3f96d3b4b6f936" gracePeriod=30 Dec 04 01:22:02 crc kubenswrapper[4764]: I1204 01:22:02.545828 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:22:02 crc kubenswrapper[4764]: E1204 01:22:02.546184 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:22:05 crc kubenswrapper[4764]: I1204 01:22:05.327179 4764 generic.go:334] "Generic (PLEG): container finished" podID="7586e32c-be69-45bb-a636-89f5a6d55502" containerID="cde867efa9ed1ef62e54ce1b45c11b1742c5a9d91626b5e7f853ee44e29d6bf7" exitCode=0 Dec 04 01:22:05 crc kubenswrapper[4764]: I1204 01:22:05.327290 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bff6c665-d8pvt" event={"ID":"7586e32c-be69-45bb-a636-89f5a6d55502","Type":"ContainerDied","Data":"cde867efa9ed1ef62e54ce1b45c11b1742c5a9d91626b5e7f853ee44e29d6bf7"} Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.339043 4764 generic.go:334] "Generic (PLEG): container finished" podID="247b2aef-77fe-4708-92b4-11255e8f2c01" 
containerID="a66aac1cca762730569a1974f2c207b4d6a6d993651e0c476085bb48fcf43e8c" exitCode=137 Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.339452 4764 generic.go:334] "Generic (PLEG): container finished" podID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerID="dc5810ef8ab98b3a7cfbd3a5acc09babca231c3f501293fe54837901e49b8e4d" exitCode=137 Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.339156 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6854fc5bc7-sj99l" event={"ID":"247b2aef-77fe-4708-92b4-11255e8f2c01","Type":"ContainerDied","Data":"a66aac1cca762730569a1974f2c207b4d6a6d993651e0c476085bb48fcf43e8c"} Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.339515 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6854fc5bc7-sj99l" event={"ID":"247b2aef-77fe-4708-92b4-11255e8f2c01","Type":"ContainerDied","Data":"dc5810ef8ab98b3a7cfbd3a5acc09babca231c3f501293fe54837901e49b8e4d"} Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.339544 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6854fc5bc7-sj99l" event={"ID":"247b2aef-77fe-4708-92b4-11255e8f2c01","Type":"ContainerDied","Data":"0a684e37416307cfcd35155255e12895dde212128068bac09f58969a35e8ebd0"} Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.339565 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a684e37416307cfcd35155255e12895dde212128068bac09f58969a35e8ebd0" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.382218 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.455374 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/247b2aef-77fe-4708-92b4-11255e8f2c01-horizon-secret-key\") pod \"247b2aef-77fe-4708-92b4-11255e8f2c01\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.455549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5nfp\" (UniqueName: \"kubernetes.io/projected/247b2aef-77fe-4708-92b4-11255e8f2c01-kube-api-access-r5nfp\") pod \"247b2aef-77fe-4708-92b4-11255e8f2c01\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.455705 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-scripts\") pod \"247b2aef-77fe-4708-92b4-11255e8f2c01\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.456113 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/247b2aef-77fe-4708-92b4-11255e8f2c01-logs\") pod \"247b2aef-77fe-4708-92b4-11255e8f2c01\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.456171 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-config-data\") pod \"247b2aef-77fe-4708-92b4-11255e8f2c01\" (UID: \"247b2aef-77fe-4708-92b4-11255e8f2c01\") " Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.456703 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/247b2aef-77fe-4708-92b4-11255e8f2c01-logs" (OuterVolumeSpecName: "logs") pod "247b2aef-77fe-4708-92b4-11255e8f2c01" (UID: "247b2aef-77fe-4708-92b4-11255e8f2c01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.457170 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/247b2aef-77fe-4708-92b4-11255e8f2c01-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.461169 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247b2aef-77fe-4708-92b4-11255e8f2c01-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "247b2aef-77fe-4708-92b4-11255e8f2c01" (UID: "247b2aef-77fe-4708-92b4-11255e8f2c01"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.461218 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247b2aef-77fe-4708-92b4-11255e8f2c01-kube-api-access-r5nfp" (OuterVolumeSpecName: "kube-api-access-r5nfp") pod "247b2aef-77fe-4708-92b4-11255e8f2c01" (UID: "247b2aef-77fe-4708-92b4-11255e8f2c01"). InnerVolumeSpecName "kube-api-access-r5nfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.480306 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-scripts" (OuterVolumeSpecName: "scripts") pod "247b2aef-77fe-4708-92b4-11255e8f2c01" (UID: "247b2aef-77fe-4708-92b4-11255e8f2c01"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.490497 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-config-data" (OuterVolumeSpecName: "config-data") pod "247b2aef-77fe-4708-92b4-11255e8f2c01" (UID: "247b2aef-77fe-4708-92b4-11255e8f2c01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.559128 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.559152 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/247b2aef-77fe-4708-92b4-11255e8f2c01-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.559165 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5nfp\" (UniqueName: \"kubernetes.io/projected/247b2aef-77fe-4708-92b4-11255e8f2c01-kube-api-access-r5nfp\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:06 crc kubenswrapper[4764]: I1204 01:22:06.559190 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/247b2aef-77fe-4708-92b4-11255e8f2c01-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:07 crc kubenswrapper[4764]: I1204 01:22:07.048234 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74bff6c665-d8pvt" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 04 01:22:07 crc kubenswrapper[4764]: I1204 
01:22:07.347123 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6854fc5bc7-sj99l" Dec 04 01:22:07 crc kubenswrapper[4764]: I1204 01:22:07.375219 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6854fc5bc7-sj99l"] Dec 04 01:22:07 crc kubenswrapper[4764]: I1204 01:22:07.383633 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6854fc5bc7-sj99l"] Dec 04 01:22:08 crc kubenswrapper[4764]: I1204 01:22:08.558672 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" path="/var/lib/kubelet/pods/247b2aef-77fe-4708-92b4-11255e8f2c01/volumes" Dec 04 01:22:12 crc kubenswrapper[4764]: I1204 01:22:12.894915 4764 scope.go:117] "RemoveContainer" containerID="d9b7406e5ae6b6e6827d3e42930f6d230ba4dff071daed3c932b57616bc01164" Dec 04 01:22:12 crc kubenswrapper[4764]: I1204 01:22:12.941359 4764 scope.go:117] "RemoveContainer" containerID="23107788f3ccdd79607654beba94204f0532168a837c22039178059e2326a4f2" Dec 04 01:22:12 crc kubenswrapper[4764]: I1204 01:22:12.982208 4764 scope.go:117] "RemoveContainer" containerID="85581f794732e9d01529d6ac7396d77e57128113f5fe3cacd69c89ccc8387790" Dec 04 01:22:13 crc kubenswrapper[4764]: I1204 01:22:13.030109 4764 scope.go:117] "RemoveContainer" containerID="7990d7e5236a93ad7e3b6d29a09ad4f537aef2909ff04feec0063b9903497573" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.546423 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:22:14 crc kubenswrapper[4764]: E1204 01:22:14.547210 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.627428 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67554b9ccc-vdgrl"] Dec 04 01:22:14 crc kubenswrapper[4764]: E1204 01:22:14.627922 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.627944 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon" Dec 04 01:22:14 crc kubenswrapper[4764]: E1204 01:22:14.627994 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon-log" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.628004 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon-log" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.628263 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.628286 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="247b2aef-77fe-4708-92b4-11255e8f2c01" containerName="horizon-log" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.633107 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.641400 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67554b9ccc-vdgrl"] Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.765023 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9560a14-b532-45fb-943d-20a22e210b3f-scripts\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.765117 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9560a14-b532-45fb-943d-20a22e210b3f-logs\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.765468 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9zrm\" (UniqueName: \"kubernetes.io/projected/c9560a14-b532-45fb-943d-20a22e210b3f-kube-api-access-r9zrm\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.765843 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9560a14-b532-45fb-943d-20a22e210b3f-config-data\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.765989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/c9560a14-b532-45fb-943d-20a22e210b3f-horizon-secret-key\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.868371 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9560a14-b532-45fb-943d-20a22e210b3f-scripts\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.868675 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9560a14-b532-45fb-943d-20a22e210b3f-logs\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.868701 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9zrm\" (UniqueName: \"kubernetes.io/projected/c9560a14-b532-45fb-943d-20a22e210b3f-kube-api-access-r9zrm\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.869113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9560a14-b532-45fb-943d-20a22e210b3f-logs\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.869254 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9560a14-b532-45fb-943d-20a22e210b3f-config-data\") pod \"horizon-67554b9ccc-vdgrl\" (UID: 
\"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.869309 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9560a14-b532-45fb-943d-20a22e210b3f-scripts\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.870566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c9560a14-b532-45fb-943d-20a22e210b3f-config-data\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.870693 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9560a14-b532-45fb-943d-20a22e210b3f-horizon-secret-key\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.884036 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c9560a14-b532-45fb-943d-20a22e210b3f-horizon-secret-key\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc kubenswrapper[4764]: I1204 01:22:14.887334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9zrm\" (UniqueName: \"kubernetes.io/projected/c9560a14-b532-45fb-943d-20a22e210b3f-kube-api-access-r9zrm\") pod \"horizon-67554b9ccc-vdgrl\" (UID: \"c9560a14-b532-45fb-943d-20a22e210b3f\") " pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:14 crc 
kubenswrapper[4764]: I1204 01:22:14.964037 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.455181 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67554b9ccc-vdgrl"] Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.752274 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-mzhfh"] Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.754002 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.770055 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mzhfh"] Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.863329 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-b5cc-account-create-update-4nx8x"] Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.864669 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.866753 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.873336 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b5cc-account-create-update-4nx8x"] Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.897359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfsg8\" (UniqueName: \"kubernetes.io/projected/322bc7d6-4e25-4535-81ba-59aff3f7331a-kube-api-access-pfsg8\") pod \"heat-db-create-mzhfh\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:15 crc kubenswrapper[4764]: I1204 01:22:15.897523 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322bc7d6-4e25-4535-81ba-59aff3f7331a-operator-scripts\") pod \"heat-db-create-mzhfh\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:15.999665 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfsg8\" (UniqueName: \"kubernetes.io/projected/322bc7d6-4e25-4535-81ba-59aff3f7331a-kube-api-access-pfsg8\") pod \"heat-db-create-mzhfh\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:15.999756 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7r5\" (UniqueName: \"kubernetes.io/projected/b7e7fb83-18af-40c9-907f-284ed3a95843-kube-api-access-gv7r5\") pod \"heat-b5cc-account-create-update-4nx8x\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " 
pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:15.999834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322bc7d6-4e25-4535-81ba-59aff3f7331a-operator-scripts\") pod \"heat-db-create-mzhfh\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:15.999860 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7e7fb83-18af-40c9-907f-284ed3a95843-operator-scripts\") pod \"heat-b5cc-account-create-update-4nx8x\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.000640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322bc7d6-4e25-4535-81ba-59aff3f7331a-operator-scripts\") pod \"heat-db-create-mzhfh\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.020169 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfsg8\" (UniqueName: \"kubernetes.io/projected/322bc7d6-4e25-4535-81ba-59aff3f7331a-kube-api-access-pfsg8\") pod \"heat-db-create-mzhfh\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.080928 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.102264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7r5\" (UniqueName: \"kubernetes.io/projected/b7e7fb83-18af-40c9-907f-284ed3a95843-kube-api-access-gv7r5\") pod \"heat-b5cc-account-create-update-4nx8x\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.102591 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7e7fb83-18af-40c9-907f-284ed3a95843-operator-scripts\") pod \"heat-b5cc-account-create-update-4nx8x\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.103319 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7e7fb83-18af-40c9-907f-284ed3a95843-operator-scripts\") pod \"heat-b5cc-account-create-update-4nx8x\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.124414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7r5\" (UniqueName: \"kubernetes.io/projected/b7e7fb83-18af-40c9-907f-284ed3a95843-kube-api-access-gv7r5\") pod \"heat-b5cc-account-create-update-4nx8x\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.186214 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.525573 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67554b9ccc-vdgrl" event={"ID":"c9560a14-b532-45fb-943d-20a22e210b3f","Type":"ContainerStarted","Data":"c9c4bf00202c138c1ce7e5b681c3901c8334716012a34e296d908d7a8b5e8008"} Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.525983 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67554b9ccc-vdgrl" event={"ID":"c9560a14-b532-45fb-943d-20a22e210b3f","Type":"ContainerStarted","Data":"7d1a3080a5b8995206a3889124068c9fb61b19dd679b5a9102dbb9e7a60bbf03"} Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.526022 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67554b9ccc-vdgrl" event={"ID":"c9560a14-b532-45fb-943d-20a22e210b3f","Type":"ContainerStarted","Data":"21e0ce9d23d7dfd33a1482b868ad43757d45904d5528ce5cb0c8a16ec89fe977"} Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.563360 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67554b9ccc-vdgrl" podStartSLOduration=2.563325221 podStartE2EDuration="2.563325221s" podCreationTimestamp="2025-12-04 01:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:22:16.557653041 +0000 UTC m=+6072.318977452" watchObservedRunningTime="2025-12-04 01:22:16.563325221 +0000 UTC m=+6072.324649632" Dec 04 01:22:16 crc kubenswrapper[4764]: W1204 01:22:16.573835 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod322bc7d6_4e25_4535_81ba_59aff3f7331a.slice/crio-17f75163dbfadf4dc777713f8229575ad8c55c3c67ea17b3ab7321fe8cb8b9e5 WatchSource:0}: Error finding container 17f75163dbfadf4dc777713f8229575ad8c55c3c67ea17b3ab7321fe8cb8b9e5: 
Status 404 returned error can't find the container with id 17f75163dbfadf4dc777713f8229575ad8c55c3c67ea17b3ab7321fe8cb8b9e5 Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.575214 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mzhfh"] Dec 04 01:22:16 crc kubenswrapper[4764]: W1204 01:22:16.784189 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7e7fb83_18af_40c9_907f_284ed3a95843.slice/crio-27f42296c00bac7899c7c4ed829e7697448d23cec043186f812651106ef125df WatchSource:0}: Error finding container 27f42296c00bac7899c7c4ed829e7697448d23cec043186f812651106ef125df: Status 404 returned error can't find the container with id 27f42296c00bac7899c7c4ed829e7697448d23cec043186f812651106ef125df Dec 04 01:22:16 crc kubenswrapper[4764]: I1204 01:22:16.790093 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b5cc-account-create-update-4nx8x"] Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 01:22:17.048265 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74bff6c665-d8pvt" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 01:22:17.534509 4764 generic.go:334] "Generic (PLEG): container finished" podID="322bc7d6-4e25-4535-81ba-59aff3f7331a" containerID="b8452a0ea41802f202e27b2709c0c0c312581e4db07df2cf52cab1d18b11bba5" exitCode=0 Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 01:22:17.534794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mzhfh" event={"ID":"322bc7d6-4e25-4535-81ba-59aff3f7331a","Type":"ContainerDied","Data":"b8452a0ea41802f202e27b2709c0c0c312581e4db07df2cf52cab1d18b11bba5"} Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 
01:22:17.534875 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mzhfh" event={"ID":"322bc7d6-4e25-4535-81ba-59aff3f7331a","Type":"ContainerStarted","Data":"17f75163dbfadf4dc777713f8229575ad8c55c3c67ea17b3ab7321fe8cb8b9e5"} Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 01:22:17.537750 4764 generic.go:334] "Generic (PLEG): container finished" podID="b7e7fb83-18af-40c9-907f-284ed3a95843" containerID="bb436f9fbb0ed12578ee66f971d6b39b3612e70cdff71fd02e1d05acd38bd47b" exitCode=0 Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 01:22:17.538480 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b5cc-account-create-update-4nx8x" event={"ID":"b7e7fb83-18af-40c9-907f-284ed3a95843","Type":"ContainerDied","Data":"bb436f9fbb0ed12578ee66f971d6b39b3612e70cdff71fd02e1d05acd38bd47b"} Dec 04 01:22:17 crc kubenswrapper[4764]: I1204 01:22:17.538590 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b5cc-account-create-update-4nx8x" event={"ID":"b7e7fb83-18af-40c9-907f-284ed3a95843","Type":"ContainerStarted","Data":"27f42296c00bac7899c7c4ed829e7697448d23cec043186f812651106ef125df"} Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.034169 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.043876 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.177649 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7e7fb83-18af-40c9-907f-284ed3a95843-operator-scripts\") pod \"b7e7fb83-18af-40c9-907f-284ed3a95843\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.177873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfsg8\" (UniqueName: \"kubernetes.io/projected/322bc7d6-4e25-4535-81ba-59aff3f7331a-kube-api-access-pfsg8\") pod \"322bc7d6-4e25-4535-81ba-59aff3f7331a\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.177893 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322bc7d6-4e25-4535-81ba-59aff3f7331a-operator-scripts\") pod \"322bc7d6-4e25-4535-81ba-59aff3f7331a\" (UID: \"322bc7d6-4e25-4535-81ba-59aff3f7331a\") " Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.177988 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv7r5\" (UniqueName: \"kubernetes.io/projected/b7e7fb83-18af-40c9-907f-284ed3a95843-kube-api-access-gv7r5\") pod \"b7e7fb83-18af-40c9-907f-284ed3a95843\" (UID: \"b7e7fb83-18af-40c9-907f-284ed3a95843\") " Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.178546 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7e7fb83-18af-40c9-907f-284ed3a95843-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7e7fb83-18af-40c9-907f-284ed3a95843" (UID: "b7e7fb83-18af-40c9-907f-284ed3a95843"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.179267 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/322bc7d6-4e25-4535-81ba-59aff3f7331a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "322bc7d6-4e25-4535-81ba-59aff3f7331a" (UID: "322bc7d6-4e25-4535-81ba-59aff3f7331a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.184082 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/322bc7d6-4e25-4535-81ba-59aff3f7331a-kube-api-access-pfsg8" (OuterVolumeSpecName: "kube-api-access-pfsg8") pod "322bc7d6-4e25-4535-81ba-59aff3f7331a" (UID: "322bc7d6-4e25-4535-81ba-59aff3f7331a"). InnerVolumeSpecName "kube-api-access-pfsg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.200220 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e7fb83-18af-40c9-907f-284ed3a95843-kube-api-access-gv7r5" (OuterVolumeSpecName: "kube-api-access-gv7r5") pod "b7e7fb83-18af-40c9-907f-284ed3a95843" (UID: "b7e7fb83-18af-40c9-907f-284ed3a95843"). InnerVolumeSpecName "kube-api-access-gv7r5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.279953 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfsg8\" (UniqueName: \"kubernetes.io/projected/322bc7d6-4e25-4535-81ba-59aff3f7331a-kube-api-access-pfsg8\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.280285 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/322bc7d6-4e25-4535-81ba-59aff3f7331a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.280297 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv7r5\" (UniqueName: \"kubernetes.io/projected/b7e7fb83-18af-40c9-907f-284ed3a95843-kube-api-access-gv7r5\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.280306 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7e7fb83-18af-40c9-907f-284ed3a95843-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.566603 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b5cc-account-create-update-4nx8x" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.566609 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b5cc-account-create-update-4nx8x" event={"ID":"b7e7fb83-18af-40c9-907f-284ed3a95843","Type":"ContainerDied","Data":"27f42296c00bac7899c7c4ed829e7697448d23cec043186f812651106ef125df"} Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.566675 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f42296c00bac7899c7c4ed829e7697448d23cec043186f812651106ef125df" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.568780 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mzhfh" event={"ID":"322bc7d6-4e25-4535-81ba-59aff3f7331a","Type":"ContainerDied","Data":"17f75163dbfadf4dc777713f8229575ad8c55c3c67ea17b3ab7321fe8cb8b9e5"} Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.568818 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f75163dbfadf4dc777713f8229575ad8c55c3c67ea17b3ab7321fe8cb8b9e5" Dec 04 01:22:19 crc kubenswrapper[4764]: I1204 01:22:19.568928 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-mzhfh" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.969385 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-c2wc5"] Dec 04 01:22:20 crc kubenswrapper[4764]: E1204 01:22:20.971125 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="322bc7d6-4e25-4535-81ba-59aff3f7331a" containerName="mariadb-database-create" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.971167 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="322bc7d6-4e25-4535-81ba-59aff3f7331a" containerName="mariadb-database-create" Dec 04 01:22:20 crc kubenswrapper[4764]: E1204 01:22:20.971249 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e7fb83-18af-40c9-907f-284ed3a95843" containerName="mariadb-account-create-update" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.971269 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e7fb83-18af-40c9-907f-284ed3a95843" containerName="mariadb-account-create-update" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.971848 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e7fb83-18af-40c9-907f-284ed3a95843" containerName="mariadb-account-create-update" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.971913 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="322bc7d6-4e25-4535-81ba-59aff3f7331a" containerName="mariadb-database-create" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.974863 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.978945 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.979101 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8sw8j" Dec 04 01:22:20 crc kubenswrapper[4764]: I1204 01:22:20.989146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-c2wc5"] Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.130797 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-config-data\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.130869 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-combined-ca-bundle\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.131613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8znt\" (UniqueName: \"kubernetes.io/projected/d619f27d-9dc1-4cbf-8fab-8085f9521299-kube-api-access-b8znt\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.234228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8znt\" (UniqueName: \"kubernetes.io/projected/d619f27d-9dc1-4cbf-8fab-8085f9521299-kube-api-access-b8znt\") pod 
\"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.234598 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-config-data\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.234764 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-combined-ca-bundle\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.240907 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-combined-ca-bundle\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.242400 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-config-data\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.267446 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8znt\" (UniqueName: \"kubernetes.io/projected/d619f27d-9dc1-4cbf-8fab-8085f9521299-kube-api-access-b8znt\") pod \"heat-db-sync-c2wc5\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.317523 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:21 crc kubenswrapper[4764]: W1204 01:22:21.784140 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd619f27d_9dc1_4cbf_8fab_8085f9521299.slice/crio-84e60689880e6f41a842de92cbce342ef9699ee3d272d6803c5923e2ab6faa46 WatchSource:0}: Error finding container 84e60689880e6f41a842de92cbce342ef9699ee3d272d6803c5923e2ab6faa46: Status 404 returned error can't find the container with id 84e60689880e6f41a842de92cbce342ef9699ee3d272d6803c5923e2ab6faa46 Dec 04 01:22:21 crc kubenswrapper[4764]: I1204 01:22:21.793297 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-c2wc5"] Dec 04 01:22:22 crc kubenswrapper[4764]: I1204 01:22:22.605632 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c2wc5" event={"ID":"d619f27d-9dc1-4cbf-8fab-8085f9521299","Type":"ContainerStarted","Data":"84e60689880e6f41a842de92cbce342ef9699ee3d272d6803c5923e2ab6faa46"} Dec 04 01:22:24 crc kubenswrapper[4764]: I1204 01:22:24.965752 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:24 crc kubenswrapper[4764]: I1204 01:22:24.966057 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:26 crc kubenswrapper[4764]: I1204 01:22:26.546160 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:22:26 crc kubenswrapper[4764]: E1204 01:22:26.546909 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:22:27 crc kubenswrapper[4764]: I1204 01:22:27.048485 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-74bff6c665-d8pvt" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Dec 04 01:22:27 crc kubenswrapper[4764]: I1204 01:22:27.048613 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:22:28 crc kubenswrapper[4764]: I1204 01:22:28.674940 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c2wc5" event={"ID":"d619f27d-9dc1-4cbf-8fab-8085f9521299","Type":"ContainerStarted","Data":"44951a39c42c818dfc0a6fefd145c77a8e4abdf5b94f28e4385ce10df458e861"} Dec 04 01:22:28 crc kubenswrapper[4764]: I1204 01:22:28.694515 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-c2wc5" podStartSLOduration=2.159431918 podStartE2EDuration="8.694499267s" podCreationTimestamp="2025-12-04 01:22:20 +0000 UTC" firstStartedPulling="2025-12-04 01:22:21.7870208 +0000 UTC m=+6077.548345231" lastFinishedPulling="2025-12-04 01:22:28.322088159 +0000 UTC m=+6084.083412580" observedRunningTime="2025-12-04 01:22:28.690999821 +0000 UTC m=+6084.452324232" watchObservedRunningTime="2025-12-04 01:22:28.694499267 +0000 UTC m=+6084.455823678" Dec 04 01:22:30 crc kubenswrapper[4764]: I1204 01:22:30.698708 4764 generic.go:334] "Generic (PLEG): container finished" podID="d619f27d-9dc1-4cbf-8fab-8085f9521299" containerID="44951a39c42c818dfc0a6fefd145c77a8e4abdf5b94f28e4385ce10df458e861" exitCode=0 Dec 04 01:22:30 crc 
kubenswrapper[4764]: I1204 01:22:30.698787 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c2wc5" event={"ID":"d619f27d-9dc1-4cbf-8fab-8085f9521299","Type":"ContainerDied","Data":"44951a39c42c818dfc0a6fefd145c77a8e4abdf5b94f28e4385ce10df458e861"} Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.713029 4764 generic.go:334] "Generic (PLEG): container finished" podID="7586e32c-be69-45bb-a636-89f5a6d55502" containerID="98673d74fc079017d323e98e905ee9976be556fd095905e6bb3f96d3b4b6f936" exitCode=137 Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.713091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bff6c665-d8pvt" event={"ID":"7586e32c-be69-45bb-a636-89f5a6d55502","Type":"ContainerDied","Data":"98673d74fc079017d323e98e905ee9976be556fd095905e6bb3f96d3b4b6f936"} Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.713491 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bff6c665-d8pvt" event={"ID":"7586e32c-be69-45bb-a636-89f5a6d55502","Type":"ContainerDied","Data":"470593dec27f69478a6bab19dc6ccb78970a22066e027966a1020bdd7d41cd06"} Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.713503 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="470593dec27f69478a6bab19dc6ccb78970a22066e027966a1020bdd7d41cd06" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.718043 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.851058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-scripts\") pod \"7586e32c-be69-45bb-a636-89f5a6d55502\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.851254 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7586e32c-be69-45bb-a636-89f5a6d55502-horizon-secret-key\") pod \"7586e32c-be69-45bb-a636-89f5a6d55502\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.851308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj7b5\" (UniqueName: \"kubernetes.io/projected/7586e32c-be69-45bb-a636-89f5a6d55502-kube-api-access-jj7b5\") pod \"7586e32c-be69-45bb-a636-89f5a6d55502\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.851381 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-config-data\") pod \"7586e32c-be69-45bb-a636-89f5a6d55502\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.851430 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7586e32c-be69-45bb-a636-89f5a6d55502-logs\") pod \"7586e32c-be69-45bb-a636-89f5a6d55502\" (UID: \"7586e32c-be69-45bb-a636-89f5a6d55502\") " Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.852360 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7586e32c-be69-45bb-a636-89f5a6d55502-logs" (OuterVolumeSpecName: "logs") pod "7586e32c-be69-45bb-a636-89f5a6d55502" (UID: "7586e32c-be69-45bb-a636-89f5a6d55502"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.857373 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7586e32c-be69-45bb-a636-89f5a6d55502-kube-api-access-jj7b5" (OuterVolumeSpecName: "kube-api-access-jj7b5") pod "7586e32c-be69-45bb-a636-89f5a6d55502" (UID: "7586e32c-be69-45bb-a636-89f5a6d55502"). InnerVolumeSpecName "kube-api-access-jj7b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.871812 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7586e32c-be69-45bb-a636-89f5a6d55502-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7586e32c-be69-45bb-a636-89f5a6d55502" (UID: "7586e32c-be69-45bb-a636-89f5a6d55502"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.882298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-config-data" (OuterVolumeSpecName: "config-data") pod "7586e32c-be69-45bb-a636-89f5a6d55502" (UID: "7586e32c-be69-45bb-a636-89f5a6d55502"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.929438 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-scripts" (OuterVolumeSpecName: "scripts") pod "7586e32c-be69-45bb-a636-89f5a6d55502" (UID: "7586e32c-be69-45bb-a636-89f5a6d55502"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.954183 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7586e32c-be69-45bb-a636-89f5a6d55502-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.954211 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj7b5\" (UniqueName: \"kubernetes.io/projected/7586e32c-be69-45bb-a636-89f5a6d55502-kube-api-access-jj7b5\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.954221 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.954229 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7586e32c-be69-45bb-a636-89f5a6d55502-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.954238 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7586e32c-be69-45bb-a636-89f5a6d55502-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:31 crc kubenswrapper[4764]: I1204 01:22:31.992036 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.055343 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8znt\" (UniqueName: \"kubernetes.io/projected/d619f27d-9dc1-4cbf-8fab-8085f9521299-kube-api-access-b8znt\") pod \"d619f27d-9dc1-4cbf-8fab-8085f9521299\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.055493 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-config-data\") pod \"d619f27d-9dc1-4cbf-8fab-8085f9521299\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.055560 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-combined-ca-bundle\") pod \"d619f27d-9dc1-4cbf-8fab-8085f9521299\" (UID: \"d619f27d-9dc1-4cbf-8fab-8085f9521299\") " Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.060554 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d619f27d-9dc1-4cbf-8fab-8085f9521299-kube-api-access-b8znt" (OuterVolumeSpecName: "kube-api-access-b8znt") pod "d619f27d-9dc1-4cbf-8fab-8085f9521299" (UID: "d619f27d-9dc1-4cbf-8fab-8085f9521299"). InnerVolumeSpecName "kube-api-access-b8znt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.080948 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d619f27d-9dc1-4cbf-8fab-8085f9521299" (UID: "d619f27d-9dc1-4cbf-8fab-8085f9521299"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.139262 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-config-data" (OuterVolumeSpecName: "config-data") pod "d619f27d-9dc1-4cbf-8fab-8085f9521299" (UID: "d619f27d-9dc1-4cbf-8fab-8085f9521299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.157848 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8znt\" (UniqueName: \"kubernetes.io/projected/d619f27d-9dc1-4cbf-8fab-8085f9521299-kube-api-access-b8znt\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.157878 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.157887 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d619f27d-9dc1-4cbf-8fab-8085f9521299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.727372 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74bff6c665-d8pvt" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.727366 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-c2wc5" event={"ID":"d619f27d-9dc1-4cbf-8fab-8085f9521299","Type":"ContainerDied","Data":"84e60689880e6f41a842de92cbce342ef9699ee3d272d6803c5923e2ab6faa46"} Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.727451 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e60689880e6f41a842de92cbce342ef9699ee3d272d6803c5923e2ab6faa46" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.727373 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-c2wc5" Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.795685 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74bff6c665-d8pvt"] Dec 04 01:22:32 crc kubenswrapper[4764]: I1204 01:22:32.806257 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74bff6c665-d8pvt"] Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.174278 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-f87fbcd57-sf6pr"] Dec 04 01:22:34 crc kubenswrapper[4764]: E1204 01:22:34.175094 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.175110 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" Dec 04 01:22:34 crc kubenswrapper[4764]: E1204 01:22:34.175130 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon-log" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.175142 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" 
containerName="horizon-log" Dec 04 01:22:34 crc kubenswrapper[4764]: E1204 01:22:34.175161 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d619f27d-9dc1-4cbf-8fab-8085f9521299" containerName="heat-db-sync" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.175169 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="d619f27d-9dc1-4cbf-8fab-8085f9521299" containerName="heat-db-sync" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.175394 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="d619f27d-9dc1-4cbf-8fab-8085f9521299" containerName="heat-db-sync" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.175421 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.175437 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" containerName="horizon-log" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.182910 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.188534 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.188758 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8sw8j" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.188825 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.222823 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-f87fbcd57-sf6pr"] Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.317883 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-combined-ca-bundle\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.317936 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64b96\" (UniqueName: \"kubernetes.io/projected/fd51fc76-66f3-4cda-9906-631301e6e3c1-kube-api-access-64b96\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.318060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-config-data\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: 
I1204 01:22:34.318091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-config-data-custom\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.318452 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6f4478855c-4tplv"] Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.328054 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.333321 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.367532 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-667b97b7d7-srfqv"] Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.380183 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.383274 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.401524 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f4478855c-4tplv"] Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.412543 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-667b97b7d7-srfqv"] Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420243 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-config-data\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420290 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-config-data\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420319 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-config-data-custom\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420381 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-combined-ca-bundle\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420424 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-combined-ca-bundle\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64b96\" (UniqueName: \"kubernetes.io/projected/fd51fc76-66f3-4cda-9906-631301e6e3c1-kube-api-access-64b96\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420838 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-config-data-custom\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.420898 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkwtj\" (UniqueName: \"kubernetes.io/projected/20544f5c-3377-485d-8170-d28325a9f913-kube-api-access-kkwtj\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.427474 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-combined-ca-bundle\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.441650 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-config-data\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.441879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64b96\" (UniqueName: \"kubernetes.io/projected/fd51fc76-66f3-4cda-9906-631301e6e3c1-kube-api-access-64b96\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.444891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd51fc76-66f3-4cda-9906-631301e6e3c1-config-data-custom\") pod \"heat-engine-f87fbcd57-sf6pr\" (UID: \"fd51fc76-66f3-4cda-9906-631301e6e3c1\") " pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.506894 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522052 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-config-data\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-combined-ca-bundle\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522405 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-combined-ca-bundle\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522495 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-config-data-custom\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-config-data\") pod \"heat-api-667b97b7d7-srfqv\" (UID: 
\"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522686 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vhc\" (UniqueName: \"kubernetes.io/projected/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-kube-api-access-h2vhc\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-config-data-custom\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.522986 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkwtj\" (UniqueName: \"kubernetes.io/projected/20544f5c-3377-485d-8170-d28325a9f913-kube-api-access-kkwtj\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.531302 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-config-data\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.531560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-combined-ca-bundle\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: 
\"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.535624 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/20544f5c-3377-485d-8170-d28325a9f913-config-data-custom\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.542424 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkwtj\" (UniqueName: \"kubernetes.io/projected/20544f5c-3377-485d-8170-d28325a9f913-kube-api-access-kkwtj\") pod \"heat-cfnapi-6f4478855c-4tplv\" (UID: \"20544f5c-3377-485d-8170-d28325a9f913\") " pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.563941 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7586e32c-be69-45bb-a636-89f5a6d55502" path="/var/lib/kubelet/pods/7586e32c-be69-45bb-a636-89f5a6d55502/volumes" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.628154 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-config-data-custom\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.628346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-config-data\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.628421 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h2vhc\" (UniqueName: \"kubernetes.io/projected/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-kube-api-access-h2vhc\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.628587 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-combined-ca-bundle\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.632474 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-combined-ca-bundle\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.635621 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-config-data-custom\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.645018 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-config-data\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.656743 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vhc\" (UniqueName: 
\"kubernetes.io/projected/07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9-kube-api-access-h2vhc\") pod \"heat-api-667b97b7d7-srfqv\" (UID: \"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9\") " pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.680680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:34 crc kubenswrapper[4764]: I1204 01:22:34.716419 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:35 crc kubenswrapper[4764]: W1204 01:22:35.031573 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd51fc76_66f3_4cda_9906_631301e6e3c1.slice/crio-835912f0e72fbffe1fc71635a454618d2431964d24ed84b86f60076254485e40 WatchSource:0}: Error finding container 835912f0e72fbffe1fc71635a454618d2431964d24ed84b86f60076254485e40: Status 404 returned error can't find the container with id 835912f0e72fbffe1fc71635a454618d2431964d24ed84b86f60076254485e40 Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.033579 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-f87fbcd57-sf6pr"] Dec 04 01:22:35 crc kubenswrapper[4764]: W1204 01:22:35.161621 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20544f5c_3377_485d_8170_d28325a9f913.slice/crio-ce1d500be78cb68206a7adb7e13d2ba2f09c14a88b927c95558f13a607d24585 WatchSource:0}: Error finding container ce1d500be78cb68206a7adb7e13d2ba2f09c14a88b927c95558f13a607d24585: Status 404 returned error can't find the container with id ce1d500be78cb68206a7adb7e13d2ba2f09c14a88b927c95558f13a607d24585 Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.163467 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6f4478855c-4tplv"] Dec 04 
01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.261323 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-667b97b7d7-srfqv"] Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.770019 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f87fbcd57-sf6pr" event={"ID":"fd51fc76-66f3-4cda-9906-631301e6e3c1","Type":"ContainerStarted","Data":"dc18328bdf36ad2ca773f8bda0c4fbe67db2e70f53b05816bacd452ff6d81091"} Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.770292 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-f87fbcd57-sf6pr" event={"ID":"fd51fc76-66f3-4cda-9906-631301e6e3c1","Type":"ContainerStarted","Data":"835912f0e72fbffe1fc71635a454618d2431964d24ed84b86f60076254485e40"} Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.770320 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.776453 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f4478855c-4tplv" event={"ID":"20544f5c-3377-485d-8170-d28325a9f913","Type":"ContainerStarted","Data":"ce1d500be78cb68206a7adb7e13d2ba2f09c14a88b927c95558f13a607d24585"} Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.779136 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-667b97b7d7-srfqv" event={"ID":"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9","Type":"ContainerStarted","Data":"000d6ecabb4ab6fb215233238b86d29abf7ff936ab32f837477976ac6c2b5a20"} Dec 04 01:22:35 crc kubenswrapper[4764]: I1204 01:22:35.787994 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-f87fbcd57-sf6pr" podStartSLOduration=1.787976232 podStartE2EDuration="1.787976232s" podCreationTimestamp="2025-12-04 01:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-04 01:22:35.785483891 +0000 UTC m=+6091.546808302" watchObservedRunningTime="2025-12-04 01:22:35.787976232 +0000 UTC m=+6091.549300643" Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.047360 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rrg6h"] Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.059770 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b12d-account-create-update-dhqvn"] Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.068104 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b12d-account-create-update-dhqvn"] Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.076225 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rrg6h"] Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.559004 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e11af5e-9683-4a35-9174-af685786f621" path="/var/lib/kubelet/pods/2e11af5e-9683-4a35-9174-af685786f621/volumes" Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.559944 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e" path="/var/lib/kubelet/pods/b266c0bb-c5e6-4a5f-9331-0b8a2ee5398e/volumes" Dec 04 01:22:36 crc kubenswrapper[4764]: I1204 01:22:36.926600 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67554b9ccc-vdgrl" Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.546368 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:22:38 crc kubenswrapper[4764]: E1204 01:22:38.547069 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.825751 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-667b97b7d7-srfqv" event={"ID":"07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9","Type":"ContainerStarted","Data":"040b0d4c55f66f3f06add28660b469d7c7c0132df43bd04441910429d4b31175"} Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.826192 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.826931 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6f4478855c-4tplv" event={"ID":"20544f5c-3377-485d-8170-d28325a9f913","Type":"ContainerStarted","Data":"fbf9b14901c891743f4edd4b7784b4b24722b6caf1aaab24aa9a64704ea905a8"} Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.827030 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.845475 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-667b97b7d7-srfqv" podStartSLOduration=2.00599573 podStartE2EDuration="4.845455476s" podCreationTimestamp="2025-12-04 01:22:34 +0000 UTC" firstStartedPulling="2025-12-04 01:22:35.258035077 +0000 UTC m=+6091.019359488" lastFinishedPulling="2025-12-04 01:22:38.097494823 +0000 UTC m=+6093.858819234" observedRunningTime="2025-12-04 01:22:38.841697103 +0000 UTC m=+6094.603021514" watchObservedRunningTime="2025-12-04 01:22:38.845455476 +0000 UTC m=+6094.606779887" Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.859776 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67554b9ccc-vdgrl" 
Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.862399 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6f4478855c-4tplv" podStartSLOduration=1.886535709 podStartE2EDuration="4.862383173s" podCreationTimestamp="2025-12-04 01:22:34 +0000 UTC" firstStartedPulling="2025-12-04 01:22:35.163733345 +0000 UTC m=+6090.925057756" lastFinishedPulling="2025-12-04 01:22:38.139580809 +0000 UTC m=+6093.900905220" observedRunningTime="2025-12-04 01:22:38.85659203 +0000 UTC m=+6094.617916461" watchObservedRunningTime="2025-12-04 01:22:38.862383173 +0000 UTC m=+6094.623707584" Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.918060 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c6459d4df-dh2rf"] Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.918615 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c6459d4df-dh2rf" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon-log" containerID="cri-o://1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae" gracePeriod=30 Dec 04 01:22:38 crc kubenswrapper[4764]: I1204 01:22:38.919515 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6c6459d4df-dh2rf" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" containerID="cri-o://612cd3f07a344a3ecee7a42c0583f40ad79557e38a3b1edd84984fd07aba56dc" gracePeriod=30 Dec 04 01:22:42 crc kubenswrapper[4764]: I1204 01:22:42.876009 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerID="612cd3f07a344a3ecee7a42c0583f40ad79557e38a3b1edd84984fd07aba56dc" exitCode=0 Dec 04 01:22:42 crc kubenswrapper[4764]: I1204 01:22:42.876075 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6459d4df-dh2rf" 
event={"ID":"e3c7ff3a-e458-4acd-a532-d7cfe9232e03","Type":"ContainerDied","Data":"612cd3f07a344a3ecee7a42c0583f40ad79557e38a3b1edd84984fd07aba56dc"} Dec 04 01:22:45 crc kubenswrapper[4764]: I1204 01:22:45.033034 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-44kxf"] Dec 04 01:22:45 crc kubenswrapper[4764]: I1204 01:22:45.062297 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-44kxf"] Dec 04 01:22:45 crc kubenswrapper[4764]: I1204 01:22:45.994165 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6f4478855c-4tplv" Dec 04 01:22:46 crc kubenswrapper[4764]: I1204 01:22:46.057264 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-667b97b7d7-srfqv" Dec 04 01:22:46 crc kubenswrapper[4764]: I1204 01:22:46.559376 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dea2f4-a927-4fcb-aca3-3ae06d5f7d62" path="/var/lib/kubelet/pods/16dea2f4-a927-4fcb-aca3-3ae06d5f7d62/volumes" Dec 04 01:22:47 crc kubenswrapper[4764]: I1204 01:22:47.721770 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c6459d4df-dh2rf" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Dec 04 01:22:52 crc kubenswrapper[4764]: I1204 01:22:52.549993 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:22:52 crc kubenswrapper[4764]: E1204 01:22:52.551011 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:22:54 crc kubenswrapper[4764]: I1204 01:22:54.561993 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-f87fbcd57-sf6pr" Dec 04 01:22:57 crc kubenswrapper[4764]: I1204 01:22:57.721621 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c6459d4df-dh2rf" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Dec 04 01:23:03 crc kubenswrapper[4764]: I1204 01:23:03.546080 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:23:03 crc kubenswrapper[4764]: E1204 01:23:03.547255 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:23:07 crc kubenswrapper[4764]: I1204 01:23:07.723489 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6c6459d4df-dh2rf" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.113:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.113:8080: connect: connection refused" Dec 04 01:23:07 crc kubenswrapper[4764]: I1204 01:23:07.724155 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:23:09 crc kubenswrapper[4764]: E1204 01:23:09.103672 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c7ff3a_e458_4acd_a532_d7cfe9232e03.slice/crio-conmon-1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c7ff3a_e458_4acd_a532_d7cfe9232e03.slice/crio-1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae.scope\": RecentStats: unable to find data in memory cache]" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.178614 4764 generic.go:334] "Generic (PLEG): container finished" podID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerID="1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae" exitCode=137 Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.178667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6459d4df-dh2rf" event={"ID":"e3c7ff3a-e458-4acd-a532-d7cfe9232e03","Type":"ContainerDied","Data":"1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae"} Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.456815 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.630191 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-horizon-secret-key\") pod \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.630325 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-scripts\") pod \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.630447 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-logs\") pod \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.630623 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-config-data\") pod \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.630809 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrsc\" (UniqueName: \"kubernetes.io/projected/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-kube-api-access-xbrsc\") pod \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\" (UID: \"e3c7ff3a-e458-4acd-a532-d7cfe9232e03\") " Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.631062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-logs" (OuterVolumeSpecName: "logs") pod "e3c7ff3a-e458-4acd-a532-d7cfe9232e03" (UID: "e3c7ff3a-e458-4acd-a532-d7cfe9232e03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.631535 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-logs\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.638296 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e3c7ff3a-e458-4acd-a532-d7cfe9232e03" (UID: "e3c7ff3a-e458-4acd-a532-d7cfe9232e03"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.639107 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-kube-api-access-xbrsc" (OuterVolumeSpecName: "kube-api-access-xbrsc") pod "e3c7ff3a-e458-4acd-a532-d7cfe9232e03" (UID: "e3c7ff3a-e458-4acd-a532-d7cfe9232e03"). InnerVolumeSpecName "kube-api-access-xbrsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.680852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-config-data" (OuterVolumeSpecName: "config-data") pod "e3c7ff3a-e458-4acd-a532-d7cfe9232e03" (UID: "e3c7ff3a-e458-4acd-a532-d7cfe9232e03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.683266 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-scripts" (OuterVolumeSpecName: "scripts") pod "e3c7ff3a-e458-4acd-a532-d7cfe9232e03" (UID: "e3c7ff3a-e458-4acd-a532-d7cfe9232e03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.733647 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrsc\" (UniqueName: \"kubernetes.io/projected/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-kube-api-access-xbrsc\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.733687 4764 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.733701 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:09 crc kubenswrapper[4764]: I1204 01:23:09.733759 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3c7ff3a-e458-4acd-a532-d7cfe9232e03-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.188856 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6459d4df-dh2rf" event={"ID":"e3c7ff3a-e458-4acd-a532-d7cfe9232e03","Type":"ContainerDied","Data":"25a06564c9e8fffe545625166e0321b47f929d2c63b760414e3b2cf90293a5ff"} Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.188888 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c6459d4df-dh2rf" Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.188925 4764 scope.go:117] "RemoveContainer" containerID="612cd3f07a344a3ecee7a42c0583f40ad79557e38a3b1edd84984fd07aba56dc" Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.225689 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6c6459d4df-dh2rf"] Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.234103 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6c6459d4df-dh2rf"] Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.381392 4764 scope.go:117] "RemoveContainer" containerID="1868a48ffb4e1e5504e957cf2121463cda6da07f3b3df065a0c1c5e2bde4c9ae" Dec 04 01:23:10 crc kubenswrapper[4764]: I1204 01:23:10.566514 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" path="/var/lib/kubelet/pods/e3c7ff3a-e458-4acd-a532-d7cfe9232e03/volumes" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.082589 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw"] Dec 04 01:23:12 crc kubenswrapper[4764]: E1204 01:23:12.083112 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon-log" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.083128 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon-log" Dec 04 01:23:12 crc kubenswrapper[4764]: E1204 01:23:12.083147 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.083156 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" Dec 04 01:23:12 crc 
kubenswrapper[4764]: I1204 01:23:12.083426 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon-log" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.083451 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c7ff3a-e458-4acd-a532-d7cfe9232e03" containerName="horizon" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.088516 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.091906 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.125544 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw"] Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.179175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crpcs\" (UniqueName: \"kubernetes.io/projected/4dd7f558-826d-4e33-bf17-9021f28ce1e6-kube-api-access-crpcs\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.179224 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc 
kubenswrapper[4764]: I1204 01:23:12.179350 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.280301 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.280435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crpcs\" (UniqueName: \"kubernetes.io/projected/4dd7f558-826d-4e33-bf17-9021f28ce1e6-kube-api-access-crpcs\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.280458 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.280906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.280913 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.307911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crpcs\" (UniqueName: \"kubernetes.io/projected/4dd7f558-826d-4e33-bf17-9021f28ce1e6-kube-api-access-crpcs\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.423743 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:12 crc kubenswrapper[4764]: I1204 01:23:12.893659 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw"] Dec 04 01:23:12 crc kubenswrapper[4764]: W1204 01:23:12.894613 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd7f558_826d_4e33_bf17_9021f28ce1e6.slice/crio-c5ff00e4d6b2a906cbe5dd1fa4eefbc9657c898a0183b8dccb791e01b4eb92df WatchSource:0}: Error finding container c5ff00e4d6b2a906cbe5dd1fa4eefbc9657c898a0183b8dccb791e01b4eb92df: Status 404 returned error can't find the container with id c5ff00e4d6b2a906cbe5dd1fa4eefbc9657c898a0183b8dccb791e01b4eb92df Dec 04 01:23:13 crc kubenswrapper[4764]: I1204 01:23:13.241374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" event={"ID":"4dd7f558-826d-4e33-bf17-9021f28ce1e6","Type":"ContainerStarted","Data":"f15afcd4d0ada6579cfbacb050ac553da4d0fee2ff85d9cdb95891f15d321624"} Dec 04 01:23:13 crc kubenswrapper[4764]: I1204 01:23:13.241678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" event={"ID":"4dd7f558-826d-4e33-bf17-9021f28ce1e6","Type":"ContainerStarted","Data":"c5ff00e4d6b2a906cbe5dd1fa4eefbc9657c898a0183b8dccb791e01b4eb92df"} Dec 04 01:23:13 crc kubenswrapper[4764]: I1204 01:23:13.241795 4764 scope.go:117] "RemoveContainer" containerID="33dbced410566aca91de81299f67c654162c0cc7fb3bf152da675bf0311337b4" Dec 04 01:23:13 crc kubenswrapper[4764]: I1204 01:23:13.286851 4764 scope.go:117] "RemoveContainer" containerID="d5c1d76698af66d19b806724063cd0182c5c74477e6a32d5519018720bf77beb" Dec 04 01:23:13 crc kubenswrapper[4764]: I1204 
01:23:13.307937 4764 scope.go:117] "RemoveContainer" containerID="0b0fdf979fbceb888396dd108d754b0e720e42d387e7eef147f3e2a44b2b32eb" Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.067337 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-eb72-account-create-update-6hb8g"] Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.078975 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m9pr5"] Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.091103 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-eb72-account-create-update-6hb8g"] Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.102906 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m9pr5"] Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.254382 4764 generic.go:334] "Generic (PLEG): container finished" podID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerID="f15afcd4d0ada6579cfbacb050ac553da4d0fee2ff85d9cdb95891f15d321624" exitCode=0 Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.254428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" event={"ID":"4dd7f558-826d-4e33-bf17-9021f28ce1e6","Type":"ContainerDied","Data":"f15afcd4d0ada6579cfbacb050ac553da4d0fee2ff85d9cdb95891f15d321624"} Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.571823 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf22e73-1e49-40e3-b33e-c0b5b9391f2f" path="/var/lib/kubelet/pods/6bf22e73-1e49-40e3-b33e-c0b5b9391f2f/volumes" Dec 04 01:23:14 crc kubenswrapper[4764]: I1204 01:23:14.574293 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43" path="/var/lib/kubelet/pods/ffbc5ed6-8b62-4371-a4e0-dd65fdcf0a43/volumes" Dec 04 01:23:16 crc kubenswrapper[4764]: I1204 01:23:16.278188 4764 
generic.go:334] "Generic (PLEG): container finished" podID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerID="7eb847e8f94b3897bb760589dc1d62e5feab30efdd1f57febdd9669a5d296927" exitCode=0 Dec 04 01:23:16 crc kubenswrapper[4764]: I1204 01:23:16.278252 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" event={"ID":"4dd7f558-826d-4e33-bf17-9021f28ce1e6","Type":"ContainerDied","Data":"7eb847e8f94b3897bb760589dc1d62e5feab30efdd1f57febdd9669a5d296927"} Dec 04 01:23:17 crc kubenswrapper[4764]: I1204 01:23:17.296064 4764 generic.go:334] "Generic (PLEG): container finished" podID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerID="28e353d5323153149a63be3ddf951da3778aec7e5a79e64fe83275ff51c24a8e" exitCode=0 Dec 04 01:23:17 crc kubenswrapper[4764]: I1204 01:23:17.296104 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" event={"ID":"4dd7f558-826d-4e33-bf17-9021f28ce1e6","Type":"ContainerDied","Data":"28e353d5323153149a63be3ddf951da3778aec7e5a79e64fe83275ff51c24a8e"} Dec 04 01:23:17 crc kubenswrapper[4764]: I1204 01:23:17.560329 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:23:17 crc kubenswrapper[4764]: E1204 01:23:17.560542 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.742335 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.937330 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-bundle\") pod \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.938372 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crpcs\" (UniqueName: \"kubernetes.io/projected/4dd7f558-826d-4e33-bf17-9021f28ce1e6-kube-api-access-crpcs\") pod \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.938504 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-util\") pod \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\" (UID: \"4dd7f558-826d-4e33-bf17-9021f28ce1e6\") " Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.942989 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-bundle" (OuterVolumeSpecName: "bundle") pod "4dd7f558-826d-4e33-bf17-9021f28ce1e6" (UID: "4dd7f558-826d-4e33-bf17-9021f28ce1e6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.945975 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd7f558-826d-4e33-bf17-9021f28ce1e6-kube-api-access-crpcs" (OuterVolumeSpecName: "kube-api-access-crpcs") pod "4dd7f558-826d-4e33-bf17-9021f28ce1e6" (UID: "4dd7f558-826d-4e33-bf17-9021f28ce1e6"). InnerVolumeSpecName "kube-api-access-crpcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:23:18 crc kubenswrapper[4764]: I1204 01:23:18.959107 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-util" (OuterVolumeSpecName: "util") pod "4dd7f558-826d-4e33-bf17-9021f28ce1e6" (UID: "4dd7f558-826d-4e33-bf17-9021f28ce1e6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:23:19 crc kubenswrapper[4764]: I1204 01:23:19.042309 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crpcs\" (UniqueName: \"kubernetes.io/projected/4dd7f558-826d-4e33-bf17-9021f28ce1e6-kube-api-access-crpcs\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:19 crc kubenswrapper[4764]: I1204 01:23:19.042371 4764 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-util\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:19 crc kubenswrapper[4764]: I1204 01:23:19.042393 4764 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4dd7f558-826d-4e33-bf17-9021f28ce1e6-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:19 crc kubenswrapper[4764]: I1204 01:23:19.325042 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" event={"ID":"4dd7f558-826d-4e33-bf17-9021f28ce1e6","Type":"ContainerDied","Data":"c5ff00e4d6b2a906cbe5dd1fa4eefbc9657c898a0183b8dccb791e01b4eb92df"} Dec 04 01:23:19 crc kubenswrapper[4764]: I1204 01:23:19.325399 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ff00e4d6b2a906cbe5dd1fa4eefbc9657c898a0183b8dccb791e01b4eb92df" Dec 04 01:23:19 crc kubenswrapper[4764]: I1204 01:23:19.325104 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw" Dec 04 01:23:21 crc kubenswrapper[4764]: I1204 01:23:21.044495 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-pzdx2"] Dec 04 01:23:21 crc kubenswrapper[4764]: I1204 01:23:21.056796 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-pzdx2"] Dec 04 01:23:22 crc kubenswrapper[4764]: I1204 01:23:22.560692 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="463f5236-7838-4635-a52b-ca2a2ef4f477" path="/var/lib/kubelet/pods/463f5236-7838-4635-a52b-ca2a2ef4f477/volumes" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.546701 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:23:30 crc kubenswrapper[4764]: E1204 01:23:30.547536 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.579392 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25"] Dec 04 01:23:30 crc kubenswrapper[4764]: E1204 01:23:30.579912 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="extract" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.579931 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="extract" Dec 04 01:23:30 crc kubenswrapper[4764]: E1204 01:23:30.579949 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="util" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.579955 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="util" Dec 04 01:23:30 crc kubenswrapper[4764]: E1204 01:23:30.579978 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="pull" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.579985 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="pull" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.580198 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd7f558-826d-4e33-bf17-9021f28ce1e6" containerName="extract" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.581665 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.612367 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.612556 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-z6nxr" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.612634 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.621141 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25"] Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.688550 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s"] Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.689872 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.693280 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-56lw9" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.693640 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.697487 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k"] Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.698852 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.710725 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s"] Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.732621 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4lxh\" (UniqueName: \"kubernetes.io/projected/cb33cd08-98e5-454a-85df-bf1f1c711c48-kube-api-access-f4lxh\") pod \"obo-prometheus-operator-668cf9dfbb-lfv25\" (UID: \"cb33cd08-98e5-454a-85df-bf1f1c711c48\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.735492 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k"] Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.834195 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4lxh\" (UniqueName: \"kubernetes.io/projected/cb33cd08-98e5-454a-85df-bf1f1c711c48-kube-api-access-f4lxh\") pod \"obo-prometheus-operator-668cf9dfbb-lfv25\" (UID: \"cb33cd08-98e5-454a-85df-bf1f1c711c48\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.834275 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8490b392-1234-48b9-8522-d7e07fca695d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k\" (UID: \"8490b392-1234-48b9-8522-d7e07fca695d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.834321 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1a8ae582-b26e-4da5-9474-d2e049f6d86e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s\" (UID: \"1a8ae582-b26e-4da5-9474-d2e049f6d86e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.834378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8490b392-1234-48b9-8522-d7e07fca695d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k\" (UID: \"8490b392-1234-48b9-8522-d7e07fca695d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.834407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a8ae582-b26e-4da5-9474-d2e049f6d86e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s\" (UID: \"1a8ae582-b26e-4da5-9474-d2e049f6d86e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.860636 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4lxh\" (UniqueName: \"kubernetes.io/projected/cb33cd08-98e5-454a-85df-bf1f1c711c48-kube-api-access-f4lxh\") pod \"obo-prometheus-operator-668cf9dfbb-lfv25\" (UID: \"cb33cd08-98e5-454a-85df-bf1f1c711c48\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.930202 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-cztpp"] Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.931811 
4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.934757 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mscdm" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.936027 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8490b392-1234-48b9-8522-d7e07fca695d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k\" (UID: \"8490b392-1234-48b9-8522-d7e07fca695d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.936093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1a8ae582-b26e-4da5-9474-d2e049f6d86e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s\" (UID: \"1a8ae582-b26e-4da5-9474-d2e049f6d86e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.936158 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8490b392-1234-48b9-8522-d7e07fca695d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k\" (UID: \"8490b392-1234-48b9-8522-d7e07fca695d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.936178 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a8ae582-b26e-4da5-9474-d2e049f6d86e-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s\" (UID: \"1a8ae582-b26e-4da5-9474-d2e049f6d86e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.937235 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.939129 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.940978 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8490b392-1234-48b9-8522-d7e07fca695d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k\" (UID: \"8490b392-1234-48b9-8522-d7e07fca695d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.941876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8490b392-1234-48b9-8522-d7e07fca695d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k\" (UID: \"8490b392-1234-48b9-8522-d7e07fca695d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.942180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1a8ae582-b26e-4da5-9474-d2e049f6d86e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s\" (UID: \"1a8ae582-b26e-4da5-9474-d2e049f6d86e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.947421 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1a8ae582-b26e-4da5-9474-d2e049f6d86e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s\" (UID: \"1a8ae582-b26e-4da5-9474-d2e049f6d86e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:30 crc kubenswrapper[4764]: I1204 01:23:30.985780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-cztpp"] Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.015411 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.037317 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.037897 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d44a2888-4635-4cac-a1d4-f68fd374072f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-cztpp\" (UID: \"d44a2888-4635-4cac-a1d4-f68fd374072f\") " pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.037977 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsjgs\" (UniqueName: \"kubernetes.io/projected/d44a2888-4635-4cac-a1d4-f68fd374072f-kube-api-access-bsjgs\") pod \"observability-operator-d8bb48f5d-cztpp\" (UID: \"d44a2888-4635-4cac-a1d4-f68fd374072f\") " pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.140878 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d44a2888-4635-4cac-a1d4-f68fd374072f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-cztpp\" (UID: \"d44a2888-4635-4cac-a1d4-f68fd374072f\") " pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.140957 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsjgs\" (UniqueName: \"kubernetes.io/projected/d44a2888-4635-4cac-a1d4-f68fd374072f-kube-api-access-bsjgs\") pod \"observability-operator-d8bb48f5d-cztpp\" (UID: \"d44a2888-4635-4cac-a1d4-f68fd374072f\") " pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.149638 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/d44a2888-4635-4cac-a1d4-f68fd374072f-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-cztpp\" (UID: \"d44a2888-4635-4cac-a1d4-f68fd374072f\") " pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.169727 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5d9jn"] Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.171176 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.176154 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsjgs\" (UniqueName: \"kubernetes.io/projected/d44a2888-4635-4cac-a1d4-f68fd374072f-kube-api-access-bsjgs\") pod \"observability-operator-d8bb48f5d-cztpp\" (UID: \"d44a2888-4635-4cac-a1d4-f68fd374072f\") " pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.228133 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-kw27p" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.277441 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5d9jn"] Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.347910 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed402497-6092-40f1-912c-6c7d59ef70f2-openshift-service-ca\") pod \"perses-operator-5446b9c989-5d9jn\" (UID: \"ed402497-6092-40f1-912c-6c7d59ef70f2\") " pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.347976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvr9k\" (UniqueName: \"kubernetes.io/projected/ed402497-6092-40f1-912c-6c7d59ef70f2-kube-api-access-jvr9k\") pod \"perses-operator-5446b9c989-5d9jn\" (UID: \"ed402497-6092-40f1-912c-6c7d59ef70f2\") " pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.426408 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.450093 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed402497-6092-40f1-912c-6c7d59ef70f2-openshift-service-ca\") pod \"perses-operator-5446b9c989-5d9jn\" (UID: \"ed402497-6092-40f1-912c-6c7d59ef70f2\") " pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.450160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvr9k\" (UniqueName: \"kubernetes.io/projected/ed402497-6092-40f1-912c-6c7d59ef70f2-kube-api-access-jvr9k\") pod \"perses-operator-5446b9c989-5d9jn\" (UID: \"ed402497-6092-40f1-912c-6c7d59ef70f2\") " pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.451325 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed402497-6092-40f1-912c-6c7d59ef70f2-openshift-service-ca\") pod \"perses-operator-5446b9c989-5d9jn\" (UID: \"ed402497-6092-40f1-912c-6c7d59ef70f2\") " pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.470110 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvr9k\" (UniqueName: \"kubernetes.io/projected/ed402497-6092-40f1-912c-6c7d59ef70f2-kube-api-access-jvr9k\") pod \"perses-operator-5446b9c989-5d9jn\" (UID: \"ed402497-6092-40f1-912c-6c7d59ef70f2\") " pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:31 crc kubenswrapper[4764]: I1204 01:23:31.497954 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:32 crc kubenswrapper[4764]: I1204 01:23:32.575877 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-5d9jn"] Dec 04 01:23:32 crc kubenswrapper[4764]: I1204 01:23:32.590448 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s"] Dec 04 01:23:32 crc kubenswrapper[4764]: I1204 01:23:32.670784 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25"] Dec 04 01:23:32 crc kubenswrapper[4764]: I1204 01:23:32.693773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k"] Dec 04 01:23:32 crc kubenswrapper[4764]: I1204 01:23:32.720523 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-cztpp"] Dec 04 01:23:33 crc kubenswrapper[4764]: I1204 01:23:33.543102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" event={"ID":"1a8ae582-b26e-4da5-9474-d2e049f6d86e","Type":"ContainerStarted","Data":"e86c5d0f292daf0a56dc2c0b1605dbc36671c3e79bb7f6e50867e47f489d66c7"} Dec 04 01:23:33 crc kubenswrapper[4764]: I1204 01:23:33.552979 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" event={"ID":"cb33cd08-98e5-454a-85df-bf1f1c711c48","Type":"ContainerStarted","Data":"6e289fd8c08f6917bbbe18786240f40fb660ea8e0211eebd486f131e1deb8f63"} Dec 04 01:23:33 crc kubenswrapper[4764]: I1204 01:23:33.555467 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" 
event={"ID":"8490b392-1234-48b9-8522-d7e07fca695d","Type":"ContainerStarted","Data":"c3a28993a2f1de95a0d4aaa81f2c2eb6e3a0ca43465f5efc3d299154331e5bd4"} Dec 04 01:23:33 crc kubenswrapper[4764]: I1204 01:23:33.559029 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" event={"ID":"ed402497-6092-40f1-912c-6c7d59ef70f2","Type":"ContainerStarted","Data":"0e72f361928b2d4b4dc58416cbf1a00d097c88ea7010104951731c9736fb3990"} Dec 04 01:23:33 crc kubenswrapper[4764]: I1204 01:23:33.561843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" event={"ID":"d44a2888-4635-4cac-a1d4-f68fd374072f","Type":"ContainerStarted","Data":"87866f9de89817f446666b0ba251904ab26c614c20add49da1b0908021e7b33e"} Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.397418 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cjj9r"] Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.399860 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.445273 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-catalog-content\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.445364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-utilities\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.445388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdzj\" (UniqueName: \"kubernetes.io/projected/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-kube-api-access-2jdzj\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.447773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjj9r"] Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.549926 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-catalog-content\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.550393 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-utilities\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.550444 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdzj\" (UniqueName: \"kubernetes.io/projected/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-kube-api-access-2jdzj\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.550628 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-catalog-content\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.550879 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-utilities\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.580211 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdzj\" (UniqueName: \"kubernetes.io/projected/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-kube-api-access-2jdzj\") pod \"certified-operators-cjj9r\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:34 crc kubenswrapper[4764]: I1204 01:23:34.723071 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:39 crc kubenswrapper[4764]: I1204 01:23:39.227624 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cjj9r"] Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.677276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" event={"ID":"ed402497-6092-40f1-912c-6c7d59ef70f2","Type":"ContainerStarted","Data":"6968bf16b8151141494464e3c5aadce6fdfaaf2c5b96cc88d84cd128d1fd897a"} Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.677816 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.684112 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" event={"ID":"d44a2888-4635-4cac-a1d4-f68fd374072f","Type":"ContainerStarted","Data":"4dbcb0a61fa026da15ad0926f734d358f47344e8415ca5784cf973ad40ed781c"} Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.684334 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.688297 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.692105 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" event={"ID":"1a8ae582-b26e-4da5-9474-d2e049f6d86e","Type":"ContainerStarted","Data":"4061dad792fb9f2baebfdbf4aa7ea45dc0a7437b25a42f745129ce447fee55f5"} Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.698421 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" event={"ID":"8490b392-1234-48b9-8522-d7e07fca695d","Type":"ContainerStarted","Data":"af4d61653eb237210db297c3ba71605b498e26540a8e768c74b9953a60aebf41"} Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.703101 4764 generic.go:334] "Generic (PLEG): container finished" podID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerID="371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec" exitCode=0 Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.703164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerDied","Data":"371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec"} Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.703214 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerStarted","Data":"7207b1c85f5aa9c12901291f6832c9186fdf9c43974676588ef3548808fa8ae9"} Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.713663 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" podStartSLOduration=2.34888956 podStartE2EDuration="11.713645896s" podCreationTimestamp="2025-12-04 01:23:31 +0000 UTC" firstStartedPulling="2025-12-04 01:23:32.649163535 +0000 UTC m=+6148.410487946" lastFinishedPulling="2025-12-04 01:23:42.013919871 +0000 UTC m=+6157.775244282" observedRunningTime="2025-12-04 01:23:42.707762881 +0000 UTC m=+6158.469087292" watchObservedRunningTime="2025-12-04 01:23:42.713645896 +0000 UTC m=+6158.474970307" Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.752374 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s" 
podStartSLOduration=3.441835758 podStartE2EDuration="12.74875302s" podCreationTimestamp="2025-12-04 01:23:30 +0000 UTC" firstStartedPulling="2025-12-04 01:23:32.614958823 +0000 UTC m=+6148.376283234" lastFinishedPulling="2025-12-04 01:23:41.921876075 +0000 UTC m=+6157.683200496" observedRunningTime="2025-12-04 01:23:42.738880777 +0000 UTC m=+6158.500205178" watchObservedRunningTime="2025-12-04 01:23:42.74875302 +0000 UTC m=+6158.510077421" Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.787510 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k" podStartSLOduration=3.5150098290000003 podStartE2EDuration="12.787490643s" podCreationTimestamp="2025-12-04 01:23:30 +0000 UTC" firstStartedPulling="2025-12-04 01:23:32.648807146 +0000 UTC m=+6148.410131547" lastFinishedPulling="2025-12-04 01:23:41.92128794 +0000 UTC m=+6157.682612361" observedRunningTime="2025-12-04 01:23:42.773097989 +0000 UTC m=+6158.534422400" watchObservedRunningTime="2025-12-04 01:23:42.787490643 +0000 UTC m=+6158.548815044" Dec 04 01:23:42 crc kubenswrapper[4764]: I1204 01:23:42.848454 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-cztpp" podStartSLOduration=3.526862592 podStartE2EDuration="12.848431644s" podCreationTimestamp="2025-12-04 01:23:30 +0000 UTC" firstStartedPulling="2025-12-04 01:23:32.736898685 +0000 UTC m=+6148.498223096" lastFinishedPulling="2025-12-04 01:23:42.058467737 +0000 UTC m=+6157.819792148" observedRunningTime="2025-12-04 01:23:42.84179212 +0000 UTC m=+6158.603116521" watchObservedRunningTime="2025-12-04 01:23:42.848431644 +0000 UTC m=+6158.609756065" Dec 04 01:23:43 crc kubenswrapper[4764]: I1204 01:23:43.546869 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:23:43 crc kubenswrapper[4764]: E1204 
01:23:43.547396 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:23:43 crc kubenswrapper[4764]: I1204 01:23:43.714779 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" event={"ID":"cb33cd08-98e5-454a-85df-bf1f1c711c48","Type":"ContainerStarted","Data":"dd68e9130a2646d2e42915204564bded7a6902ca771faa6d13c0d6af357b24e7"} Dec 04 01:23:43 crc kubenswrapper[4764]: I1204 01:23:43.717216 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerStarted","Data":"53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4"} Dec 04 01:23:43 crc kubenswrapper[4764]: I1204 01:23:43.741129 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-lfv25" podStartSLOduration=4.431635723 podStartE2EDuration="13.741112388s" podCreationTimestamp="2025-12-04 01:23:30 +0000 UTC" firstStartedPulling="2025-12-04 01:23:32.636990035 +0000 UTC m=+6148.398314446" lastFinishedPulling="2025-12-04 01:23:41.9464667 +0000 UTC m=+6157.707791111" observedRunningTime="2025-12-04 01:23:43.736803442 +0000 UTC m=+6159.498127863" watchObservedRunningTime="2025-12-04 01:23:43.741112388 +0000 UTC m=+6159.502436799" Dec 04 01:23:45 crc kubenswrapper[4764]: I1204 01:23:45.740200 4764 generic.go:334] "Generic (PLEG): container finished" podID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerID="53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4" 
exitCode=0 Dec 04 01:23:45 crc kubenswrapper[4764]: I1204 01:23:45.740333 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerDied","Data":"53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4"} Dec 04 01:23:46 crc kubenswrapper[4764]: I1204 01:23:46.755086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerStarted","Data":"bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212"} Dec 04 01:23:46 crc kubenswrapper[4764]: I1204 01:23:46.779909 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cjj9r" podStartSLOduration=9.291744576 podStartE2EDuration="12.779889112s" podCreationTimestamp="2025-12-04 01:23:34 +0000 UTC" firstStartedPulling="2025-12-04 01:23:42.70570953 +0000 UTC m=+6158.467033941" lastFinishedPulling="2025-12-04 01:23:46.193854066 +0000 UTC m=+6161.955178477" observedRunningTime="2025-12-04 01:23:46.7745234 +0000 UTC m=+6162.535847851" watchObservedRunningTime="2025-12-04 01:23:46.779889112 +0000 UTC m=+6162.541213533" Dec 04 01:23:51 crc kubenswrapper[4764]: I1204 01:23:51.501851 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-5d9jn" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.058746 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.059428 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" containerName="openstackclient" containerID="cri-o://92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123" gracePeriod=2 Dec 04 
01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.074465 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.099128 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 01:23:54 crc kubenswrapper[4764]: E1204 01:23:54.099857 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" containerName="openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.099879 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" containerName="openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.100089 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" containerName="openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.100875 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.110941 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" podUID="dc6364bf-30dd-48f6-813a-bc1ece8188a4" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.116874 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.220041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc6364bf-30dd-48f6-813a-bc1ece8188a4-openstack-config-secret\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.220425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmgc\" (UniqueName: \"kubernetes.io/projected/dc6364bf-30dd-48f6-813a-bc1ece8188a4-kube-api-access-hhmgc\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.220805 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc6364bf-30dd-48f6-813a-bc1ece8188a4-openstack-config\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.239149 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.240445 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.245201 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-v7lnm" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.249474 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.322913 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmgc\" (UniqueName: \"kubernetes.io/projected/dc6364bf-30dd-48f6-813a-bc1ece8188a4-kube-api-access-hhmgc\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.323342 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc6364bf-30dd-48f6-813a-bc1ece8188a4-openstack-config\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.323412 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc6364bf-30dd-48f6-813a-bc1ece8188a4-openstack-config-secret\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.325158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/dc6364bf-30dd-48f6-813a-bc1ece8188a4-openstack-config\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.334265 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/dc6364bf-30dd-48f6-813a-bc1ece8188a4-openstack-config-secret\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.354648 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmgc\" (UniqueName: \"kubernetes.io/projected/dc6364bf-30dd-48f6-813a-bc1ece8188a4-kube-api-access-hhmgc\") pod \"openstackclient\" (UID: \"dc6364bf-30dd-48f6-813a-bc1ece8188a4\") " pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.420151 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.424541 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7zk7\" (UniqueName: \"kubernetes.io/projected/9e25921f-2ca3-4360-a344-fe779ac2ac52-kube-api-access-b7zk7\") pod \"kube-state-metrics-0\" (UID: \"9e25921f-2ca3-4360-a344-fe779ac2ac52\") " pod="openstack/kube-state-metrics-0" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.528552 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7zk7\" (UniqueName: \"kubernetes.io/projected/9e25921f-2ca3-4360-a344-fe779ac2ac52-kube-api-access-b7zk7\") pod \"kube-state-metrics-0\" (UID: \"9e25921f-2ca3-4360-a344-fe779ac2ac52\") " pod="openstack/kube-state-metrics-0" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.565776 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7zk7\" (UniqueName: \"kubernetes.io/projected/9e25921f-2ca3-4360-a344-fe779ac2ac52-kube-api-access-b7zk7\") pod \"kube-state-metrics-0\" (UID: \"9e25921f-2ca3-4360-a344-fe779ac2ac52\") " pod="openstack/kube-state-metrics-0" Dec 04 01:23:54 crc 
kubenswrapper[4764]: I1204 01:23:54.733952 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.734307 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.820323 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.854492 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 01:23:54 crc kubenswrapper[4764]: I1204 01:23:54.940623 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.048607 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.051136 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.056345 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.058237 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-5454v" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.058412 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.058507 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.058602 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.079476 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.105818 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjj9r"] Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.147550 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 01:23:55 crc kubenswrapper[4764]: W1204 01:23:55.178200 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc6364bf_30dd_48f6_813a_bc1ece8188a4.slice/crio-173370f9e826f81b94383e731e85900302d19ce4507e586fd5aab24748dfe5a0 WatchSource:0}: Error finding container 173370f9e826f81b94383e731e85900302d19ce4507e586fd5aab24748dfe5a0: Status 404 returned error can't find the container with id 
173370f9e826f81b94383e731e85900302d19ce4507e586fd5aab24748dfe5a0 Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.267434 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.267949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/97e1e583-be68-470d-a9fd-b8bc27831cd4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.268113 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/97e1e583-be68-470d-a9fd-b8bc27831cd4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.268132 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffdld\" (UniqueName: \"kubernetes.io/projected/97e1e583-be68-470d-a9fd-b8bc27831cd4-kube-api-access-ffdld\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.268158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.268357 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/97e1e583-be68-470d-a9fd-b8bc27831cd4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.268442 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372332 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/97e1e583-be68-470d-a9fd-b8bc27831cd4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372395 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372423 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372482 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/97e1e583-be68-470d-a9fd-b8bc27831cd4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372545 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/97e1e583-be68-470d-a9fd-b8bc27831cd4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372564 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffdld\" (UniqueName: \"kubernetes.io/projected/97e1e583-be68-470d-a9fd-b8bc27831cd4-kube-api-access-ffdld\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.372587 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.377063 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" 
(UniqueName: \"kubernetes.io/empty-dir/97e1e583-be68-470d-a9fd-b8bc27831cd4-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.377540 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.394378 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/97e1e583-be68-470d-a9fd-b8bc27831cd4-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.395383 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/97e1e583-be68-470d-a9fd-b8bc27831cd4-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.398640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.398986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/97e1e583-be68-470d-a9fd-b8bc27831cd4-web-config\") 
pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.421687 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffdld\" (UniqueName: \"kubernetes.io/projected/97e1e583-be68-470d-a9fd-b8bc27831cd4-kube-api-access-ffdld\") pod \"alertmanager-metric-storage-0\" (UID: \"97e1e583-be68-470d-a9fd-b8bc27831cd4\") " pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.546741 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:23:55 crc kubenswrapper[4764]: E1204 01:23:55.547020 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.596401 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.599648 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.611900 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.612629 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-7slx6" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.612770 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.612239 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.612973 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.613005 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.696361 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.760345 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.839376 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.850995 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851095 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91dceff4-dc61-4803-98fb-da530493e50c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851192 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851531 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/91dceff4-dc61-4803-98fb-da530493e50c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851681 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91dceff4-dc61-4803-98fb-da530493e50c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851762 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-config\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851835 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.851878 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4wc\" (UniqueName: \"kubernetes.io/projected/91dceff4-dc61-4803-98fb-da530493e50c-kube-api-access-vk4wc\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.952465 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"dc6364bf-30dd-48f6-813a-bc1ece8188a4","Type":"ContainerStarted","Data":"173370f9e826f81b94383e731e85900302d19ce4507e586fd5aab24748dfe5a0"} Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953345 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-config\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953435 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4wc\" (UniqueName: \"kubernetes.io/projected/91dceff4-dc61-4803-98fb-da530493e50c-kube-api-access-vk4wc\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953496 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91dceff4-dc61-4803-98fb-da530493e50c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953564 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/91dceff4-dc61-4803-98fb-da530493e50c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.953657 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91dceff4-dc61-4803-98fb-da530493e50c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.960987 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9e25921f-2ca3-4360-a344-fe779ac2ac52","Type":"ContainerStarted","Data":"865443e035f55fe07a3d90e43294f1806be250d2beb63ebd23114b79ca95aa5b"} Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.963414 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-config\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.964440 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.964809 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/91dceff4-dc61-4803-98fb-da530493e50c-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.968123 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/91dceff4-dc61-4803-98fb-da530493e50c-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.969321 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/91dceff4-dc61-4803-98fb-da530493e50c-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.975902 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/91dceff4-dc61-4803-98fb-da530493e50c-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:55 crc kubenswrapper[4764]: I1204 01:23:55.985680 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4wc\" (UniqueName: 
\"kubernetes.io/projected/91dceff4-dc61-4803-98fb-da530493e50c-kube-api-access-vk4wc\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.006746 4764 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.006788 4764 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5d927250dcadcb05055bb870986db9e9f77c6a787543223601ca217f76fca4ab/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.246051 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c56a40ed-fd08-4a5c-9f68-e932678ea670\") pod \"prometheus-metric-storage-0\" (UID: \"91dceff4-dc61-4803-98fb-da530493e50c\") " pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.528467 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.626359 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.716748 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.723618 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" podUID="dc6364bf-30dd-48f6-813a-bc1ece8188a4" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.876404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwbk4\" (UniqueName: \"kubernetes.io/projected/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-kube-api-access-kwbk4\") pod \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.876873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config\") pod \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.876941 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config-secret\") pod \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\" (UID: \"9492ba61-0ca6-433e-9eac-9819f2f0ff4c\") " Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.888222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-kube-api-access-kwbk4" (OuterVolumeSpecName: "kube-api-access-kwbk4") pod "9492ba61-0ca6-433e-9eac-9819f2f0ff4c" (UID: "9492ba61-0ca6-433e-9eac-9819f2f0ff4c"). InnerVolumeSpecName "kube-api-access-kwbk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:23:56 crc kubenswrapper[4764]: I1204 01:23:56.916634 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9492ba61-0ca6-433e-9eac-9819f2f0ff4c" (UID: "9492ba61-0ca6-433e-9eac-9819f2f0ff4c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.021252 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.021283 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwbk4\" (UniqueName: \"kubernetes.io/projected/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-kube-api-access-kwbk4\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.021637 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9492ba61-0ca6-433e-9eac-9819f2f0ff4c" (UID: "9492ba61-0ca6-433e-9eac-9819f2f0ff4c"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.028903 4764 generic.go:334] "Generic (PLEG): container finished" podID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" containerID="92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123" exitCode=137 Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.028976 4764 scope.go:117] "RemoveContainer" containerID="92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.029097 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.031241 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"97e1e583-be68-470d-a9fd-b8bc27831cd4","Type":"ContainerStarted","Data":"cb47230814379093aaf08e10129176db4cc35ef7f33755c00b7621f1a3551f93"} Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.032388 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" podUID="dc6364bf-30dd-48f6-813a-bc1ece8188a4" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.048025 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"dc6364bf-30dd-48f6-813a-bc1ece8188a4","Type":"ContainerStarted","Data":"cc93f0f55ea5a3f1d70170644a5cd707dbda886bfca74186fdf46c4b915007fd"} Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.049984 4764 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" podUID="dc6364bf-30dd-48f6-813a-bc1ece8188a4" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.057794 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"9e25921f-2ca3-4360-a344-fe779ac2ac52","Type":"ContainerStarted","Data":"04ad271a6dbe5f9187a8d43341fa49541f45da152fc5e6e39f8392b60eee93ec"} Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.057788 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cjj9r" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="registry-server" containerID="cri-o://bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212" gracePeriod=2 Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.058731 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.062896 4764 scope.go:117] "RemoveContainer" containerID="92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123" Dec 04 01:23:57 crc kubenswrapper[4764]: E1204 01:23:57.064819 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123\": container with ID starting with 92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123 not found: ID does not exist" containerID="92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.064859 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123"} err="failed to get container status \"92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123\": rpc error: code = NotFound desc = could not find container \"92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123\": container with ID starting with 92077d5e92788eb154a24ae0d0cda610431921846d15d9dd31f5530d641b1123 not found: ID does not exist" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.108357 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.10833882 podStartE2EDuration="3.10833882s" podCreationTimestamp="2025-12-04 01:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:23:57.074445456 +0000 UTC m=+6172.835769867" watchObservedRunningTime="2025-12-04 01:23:57.10833882 +0000 UTC m=+6172.869663231" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.113124 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.581292746 podStartE2EDuration="3.113113838s" podCreationTimestamp="2025-12-04 01:23:54 +0000 UTC" firstStartedPulling="2025-12-04 01:23:55.711391662 +0000 UTC m=+6171.472716073" lastFinishedPulling="2025-12-04 01:23:56.243212754 +0000 UTC m=+6172.004537165" observedRunningTime="2025-12-04 01:23:57.100431725 +0000 UTC m=+6172.861756136" watchObservedRunningTime="2025-12-04 01:23:57.113113838 +0000 UTC m=+6172.874438249" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.125197 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.126094 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9492ba61-0ca6-433e-9eac-9819f2f0ff4c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.682053 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.746833 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-utilities\") pod \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.747486 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdzj\" (UniqueName: \"kubernetes.io/projected/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-kube-api-access-2jdzj\") pod \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.747586 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-catalog-content\") pod \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\" (UID: \"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed\") " Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.748313 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-utilities" (OuterVolumeSpecName: "utilities") pod "8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" (UID: "8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.749305 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.773401 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-kube-api-access-2jdzj" (OuterVolumeSpecName: "kube-api-access-2jdzj") pod "8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" (UID: "8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed"). InnerVolumeSpecName "kube-api-access-2jdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.805306 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" (UID: "8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.850634 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdzj\" (UniqueName: \"kubernetes.io/projected/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-kube-api-access-2jdzj\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:57 crc kubenswrapper[4764]: I1204 01:23:57.850659 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.066374 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"91dceff4-dc61-4803-98fb-da530493e50c","Type":"ContainerStarted","Data":"57370461a72abc279af003903a5ec100a97efb11b9d125088d496a2774f17cd1"} Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.080154 4764 generic.go:334] "Generic (PLEG): container finished" podID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerID="bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212" exitCode=0 Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.081137 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cjj9r" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.088505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerDied","Data":"bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212"} Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.088549 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cjj9r" event={"ID":"8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed","Type":"ContainerDied","Data":"7207b1c85f5aa9c12901291f6832c9186fdf9c43974676588ef3548808fa8ae9"} Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.088566 4764 scope.go:117] "RemoveContainer" containerID="bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.145227 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cjj9r"] Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.161845 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cjj9r"] Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.171316 4764 scope.go:117] "RemoveContainer" containerID="53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.217238 4764 scope.go:117] "RemoveContainer" containerID="371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.273497 4764 scope.go:117] "RemoveContainer" containerID="bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212" Dec 04 01:23:58 crc kubenswrapper[4764]: E1204 01:23:58.280300 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212\": container with ID starting with bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212 not found: ID does not exist" containerID="bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.280340 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212"} err="failed to get container status \"bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212\": rpc error: code = NotFound desc = could not find container \"bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212\": container with ID starting with bf822dbfb2a5744e244f51414032f70bb615d9c18c9df1799eebaa59fc3cc212 not found: ID does not exist" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.280367 4764 scope.go:117] "RemoveContainer" containerID="53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4" Dec 04 01:23:58 crc kubenswrapper[4764]: E1204 01:23:58.280889 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4\": container with ID starting with 53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4 not found: ID does not exist" containerID="53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.280942 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4"} err="failed to get container status \"53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4\": rpc error: code = NotFound desc = could not find container \"53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4\": container with ID 
starting with 53adcc86c891857ca9667fe549815b0e5885f1fe44ccc9f75b540e0f87545ac4 not found: ID does not exist" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.280975 4764 scope.go:117] "RemoveContainer" containerID="371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec" Dec 04 01:23:58 crc kubenswrapper[4764]: E1204 01:23:58.282143 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec\": container with ID starting with 371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec not found: ID does not exist" containerID="371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.282172 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec"} err="failed to get container status \"371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec\": rpc error: code = NotFound desc = could not find container \"371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec\": container with ID starting with 371f5a3dfe8bf1373dbe1c72f310d8509dd176e44ce3d8f3becfd944dd9364ec not found: ID does not exist" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.569947 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" path="/var/lib/kubelet/pods/8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed/volumes" Dec 04 01:23:58 crc kubenswrapper[4764]: I1204 01:23:58.575008 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9492ba61-0ca6-433e-9eac-9819f2f0ff4c" path="/var/lib/kubelet/pods/9492ba61-0ca6-433e-9eac-9819f2f0ff4c/volumes" Dec 04 01:24:03 crc kubenswrapper[4764]: I1204 01:24:03.156005 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"97e1e583-be68-470d-a9fd-b8bc27831cd4","Type":"ContainerStarted","Data":"c2343754bdbaec6cb5a03e4dfdcc123e322c02fda3fa26f2c239284203f2d02c"} Dec 04 01:24:04 crc kubenswrapper[4764]: I1204 01:24:04.171954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"91dceff4-dc61-4803-98fb-da530493e50c","Type":"ContainerStarted","Data":"96926586d3a93854334d43c71eba1ef4122507e25266ecc54336dcbf6f8e6264"} Dec 04 01:24:04 crc kubenswrapper[4764]: I1204 01:24:04.861990 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 01:24:06 crc kubenswrapper[4764]: I1204 01:24:06.546001 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:24:06 crc kubenswrapper[4764]: E1204 01:24:06.546912 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:24:11 crc kubenswrapper[4764]: I1204 01:24:11.254363 4764 generic.go:334] "Generic (PLEG): container finished" podID="97e1e583-be68-470d-a9fd-b8bc27831cd4" containerID="c2343754bdbaec6cb5a03e4dfdcc123e322c02fda3fa26f2c239284203f2d02c" exitCode=0 Dec 04 01:24:11 crc kubenswrapper[4764]: I1204 01:24:11.254784 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"97e1e583-be68-470d-a9fd-b8bc27831cd4","Type":"ContainerDied","Data":"c2343754bdbaec6cb5a03e4dfdcc123e322c02fda3fa26f2c239284203f2d02c"} Dec 04 01:24:12 crc kubenswrapper[4764]: I1204 01:24:12.273101 4764 
generic.go:334] "Generic (PLEG): container finished" podID="91dceff4-dc61-4803-98fb-da530493e50c" containerID="96926586d3a93854334d43c71eba1ef4122507e25266ecc54336dcbf6f8e6264" exitCode=0 Dec 04 01:24:12 crc kubenswrapper[4764]: I1204 01:24:12.273174 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"91dceff4-dc61-4803-98fb-da530493e50c","Type":"ContainerDied","Data":"96926586d3a93854334d43c71eba1ef4122507e25266ecc54336dcbf6f8e6264"} Dec 04 01:24:13 crc kubenswrapper[4764]: I1204 01:24:13.462060 4764 scope.go:117] "RemoveContainer" containerID="b261d11aa6316127130d3b54b69517d2eb7813fa16e9ce8b367b96a04c3462ed" Dec 04 01:24:13 crc kubenswrapper[4764]: I1204 01:24:13.514375 4764 scope.go:117] "RemoveContainer" containerID="c9abad1422dc9ad9e075587f1ca11853ab13a419b0379735f34274780f74e6ba" Dec 04 01:24:13 crc kubenswrapper[4764]: I1204 01:24:13.629560 4764 scope.go:117] "RemoveContainer" containerID="f0d36fc378dec5fcca5b3624ff3e72903cfc690d0a49c4b4448161f64850cb8c" Dec 04 01:24:14 crc kubenswrapper[4764]: I1204 01:24:14.302673 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"97e1e583-be68-470d-a9fd-b8bc27831cd4","Type":"ContainerStarted","Data":"4664b24325c59b95c679d20ce836719e0b8cdf1852bb625187d0e93291d68738"} Dec 04 01:24:17 crc kubenswrapper[4764]: I1204 01:24:17.333006 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"97e1e583-be68-470d-a9fd-b8bc27831cd4","Type":"ContainerStarted","Data":"5c9b0a7041b039df9c8c7f52fb4e5a3229edaa052ea525509f92a2625145b407"} Dec 04 01:24:17 crc kubenswrapper[4764]: I1204 01:24:17.334841 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 04 01:24:17 crc kubenswrapper[4764]: I1204 01:24:17.336403 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/alertmanager-metric-storage-0" Dec 04 01:24:17 crc kubenswrapper[4764]: I1204 01:24:17.357142 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.437654936 podStartE2EDuration="22.357125521s" podCreationTimestamp="2025-12-04 01:23:55 +0000 UTC" firstStartedPulling="2025-12-04 01:23:56.599200757 +0000 UTC m=+6172.360525158" lastFinishedPulling="2025-12-04 01:24:13.518671332 +0000 UTC m=+6189.279995743" observedRunningTime="2025-12-04 01:24:17.352767494 +0000 UTC m=+6193.114091935" watchObservedRunningTime="2025-12-04 01:24:17.357125521 +0000 UTC m=+6193.118449932" Dec 04 01:24:19 crc kubenswrapper[4764]: I1204 01:24:19.545750 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:24:19 crc kubenswrapper[4764]: E1204 01:24:19.546535 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:24:20 crc kubenswrapper[4764]: I1204 01:24:20.367240 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"91dceff4-dc61-4803-98fb-da530493e50c","Type":"ContainerStarted","Data":"7ccd61bc85b8c97359f156cf16062268f769e39951918b3e34bec8fd2f5fe4eb"} Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.100015 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-drt66"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.123198 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qdtmh"] Dec 04 01:24:22 
crc kubenswrapper[4764]: I1204 01:24:22.137115 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-drt66"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.151284 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-937f-account-create-update-k224b"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.162312 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cccc-account-create-update-2rbt2"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.170651 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7cb3-account-create-update-k4bwd"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.177824 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-xpp64"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.184913 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qdtmh"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.191705 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-937f-account-create-update-k224b"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.200500 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cccc-account-create-update-2rbt2"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.208335 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7cb3-account-create-update-k4bwd"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.215486 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-xpp64"] Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.561449 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015a0994-0aee-4ed7-a588-ed74a8cb5db3" path="/var/lib/kubelet/pods/015a0994-0aee-4ed7-a588-ed74a8cb5db3/volumes" Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 
01:24:22.562619 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e40f27-93e1-44ed-8887-e7ba539f4270" path="/var/lib/kubelet/pods/07e40f27-93e1-44ed-8887-e7ba539f4270/volumes" Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.563601 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33377c8a-9ea1-440b-aac0-8492e36ae5d3" path="/var/lib/kubelet/pods/33377c8a-9ea1-440b-aac0-8492e36ae5d3/volumes" Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.564629 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386aa79c-4a08-4154-896d-f7be1f444951" path="/var/lib/kubelet/pods/386aa79c-4a08-4154-896d-f7be1f444951/volumes" Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.566118 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4" path="/var/lib/kubelet/pods/67c7be3d-f739-4ef4-9a96-e5ab3c3fb3b4/volumes" Dec 04 01:24:22 crc kubenswrapper[4764]: I1204 01:24:22.567595 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7003749-8986-4904-acbf-32390dab7600" path="/var/lib/kubelet/pods/a7003749-8986-4904-acbf-32390dab7600/volumes" Dec 04 01:24:25 crc kubenswrapper[4764]: I1204 01:24:25.457348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"91dceff4-dc61-4803-98fb-da530493e50c","Type":"ContainerStarted","Data":"c875e836db8afae84a2206e3a266c914e7a8e885a70e68ebbf098bbeb9887d01"} Dec 04 01:24:28 crc kubenswrapper[4764]: I1204 01:24:28.491095 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"91dceff4-dc61-4803-98fb-da530493e50c","Type":"ContainerStarted","Data":"0e6a692d77d2444fcebb8115a44114c49f041c3ec0cdf42283d50d33eec08655"} Dec 04 01:24:28 crc kubenswrapper[4764]: I1204 01:24:28.530684 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.420609624 podStartE2EDuration="34.530662023s" podCreationTimestamp="2025-12-04 01:23:54 +0000 UTC" firstStartedPulling="2025-12-04 01:23:57.200352735 +0000 UTC m=+6172.961677146" lastFinishedPulling="2025-12-04 01:24:27.310405134 +0000 UTC m=+6203.071729545" observedRunningTime="2025-12-04 01:24:28.518876873 +0000 UTC m=+6204.280201284" watchObservedRunningTime="2025-12-04 01:24:28.530662023 +0000 UTC m=+6204.291986434" Dec 04 01:24:31 crc kubenswrapper[4764]: I1204 01:24:31.528933 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 01:24:32 crc kubenswrapper[4764]: I1204 01:24:32.036923 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n9cc8"] Dec 04 01:24:32 crc kubenswrapper[4764]: I1204 01:24:32.049971 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-n9cc8"] Dec 04 01:24:32 crc kubenswrapper[4764]: I1204 01:24:32.560333 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6db3b0c-354c-430a-aa30-9c1a14a3c540" path="/var/lib/kubelet/pods/b6db3b0c-354c-430a-aa30-9c1a14a3c540/volumes" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.545657 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:24:33 crc kubenswrapper[4764]: E1204 01:24:33.545965 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.608460 4764 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:24:33 crc kubenswrapper[4764]: E1204 01:24:33.609923 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="extract-utilities" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.610030 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="extract-utilities" Dec 04 01:24:33 crc kubenswrapper[4764]: E1204 01:24:33.610136 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="extract-content" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.610214 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="extract-content" Dec 04 01:24:33 crc kubenswrapper[4764]: E1204 01:24:33.610321 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="registry-server" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.610550 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="registry-server" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.610908 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dcee5ca-c8d1-4695-8c5f-0769ed7c28ed" containerName="registry-server" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.613749 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.618001 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.618282 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.625795 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703215 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-run-httpd\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703307 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-scripts\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703388 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-log-httpd\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " 
pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703440 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703579 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-config-data\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.703671 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgpp2\" (UniqueName: \"kubernetes.io/projected/8096940b-b4f5-4998-9eec-4a74fed6d469-kube-api-access-dgpp2\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805068 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805196 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-config-data\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805246 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dgpp2\" (UniqueName: \"kubernetes.io/projected/8096940b-b4f5-4998-9eec-4a74fed6d469-kube-api-access-dgpp2\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-run-httpd\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805306 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-scripts\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805506 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-log-httpd\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805880 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-run-httpd\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " 
pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.805903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-log-httpd\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.821376 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.822282 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-scripts\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.830467 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.834862 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-config-data\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.841403 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgpp2\" (UniqueName: 
\"kubernetes.io/projected/8096940b-b4f5-4998-9eec-4a74fed6d469-kube-api-access-dgpp2\") pod \"ceilometer-0\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " pod="openstack/ceilometer-0" Dec 04 01:24:33 crc kubenswrapper[4764]: I1204 01:24:33.950067 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:24:34 crc kubenswrapper[4764]: W1204 01:24:34.444416 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8096940b_b4f5_4998_9eec_4a74fed6d469.slice/crio-9aad5faab4e25e555ba0e7d17c4374f281b415c71e8df3076b56c9c93e7aa513 WatchSource:0}: Error finding container 9aad5faab4e25e555ba0e7d17c4374f281b415c71e8df3076b56c9c93e7aa513: Status 404 returned error can't find the container with id 9aad5faab4e25e555ba0e7d17c4374f281b415c71e8df3076b56c9c93e7aa513 Dec 04 01:24:34 crc kubenswrapper[4764]: I1204 01:24:34.456825 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:24:34 crc kubenswrapper[4764]: I1204 01:24:34.563611 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerStarted","Data":"9aad5faab4e25e555ba0e7d17c4374f281b415c71e8df3076b56c9c93e7aa513"} Dec 04 01:24:35 crc kubenswrapper[4764]: I1204 01:24:35.574248 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerStarted","Data":"ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6"} Dec 04 01:24:36 crc kubenswrapper[4764]: I1204 01:24:36.587882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerStarted","Data":"3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd"} Dec 04 01:24:37 crc kubenswrapper[4764]: I1204 
01:24:37.597086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerStarted","Data":"c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae"} Dec 04 01:24:39 crc kubenswrapper[4764]: I1204 01:24:39.620690 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerStarted","Data":"ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2"} Dec 04 01:24:39 crc kubenswrapper[4764]: I1204 01:24:39.621225 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 01:24:39 crc kubenswrapper[4764]: I1204 01:24:39.664867 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.610466061 podStartE2EDuration="6.664848375s" podCreationTimestamp="2025-12-04 01:24:33 +0000 UTC" firstStartedPulling="2025-12-04 01:24:34.447258697 +0000 UTC m=+6210.208583128" lastFinishedPulling="2025-12-04 01:24:38.501641031 +0000 UTC m=+6214.262965442" observedRunningTime="2025-12-04 01:24:39.657547146 +0000 UTC m=+6215.418871557" watchObservedRunningTime="2025-12-04 01:24:39.664848375 +0000 UTC m=+6215.426172786" Dec 04 01:24:41 crc kubenswrapper[4764]: I1204 01:24:41.529187 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 01:24:41 crc kubenswrapper[4764]: I1204 01:24:41.532698 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 01:24:41 crc kubenswrapper[4764]: I1204 01:24:41.655759 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.502085 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/aodh-db-create-84nsz"] Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.505684 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.510871 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-84nsz"] Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.546572 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:24:46 crc kubenswrapper[4764]: E1204 01:24:46.546943 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.582020 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9pfx\" (UniqueName: \"kubernetes.io/projected/115f6f75-7a37-4896-881f-f937b03faf91-kube-api-access-q9pfx\") pod \"aodh-db-create-84nsz\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.582501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/115f6f75-7a37-4896-881f-f937b03faf91-operator-scripts\") pod \"aodh-db-create-84nsz\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.602333 4764 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/aodh-5228-account-create-update-wvbk8"] Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.603646 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.608512 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.616095 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-5228-account-create-update-wvbk8"] Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.684792 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/115f6f75-7a37-4896-881f-f937b03faf91-operator-scripts\") pod \"aodh-db-create-84nsz\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.684875 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtkd\" (UniqueName: \"kubernetes.io/projected/aa72ee03-bea2-443c-babc-8a4319e2fc39-kube-api-access-7jtkd\") pod \"aodh-5228-account-create-update-wvbk8\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.684950 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa72ee03-bea2-443c-babc-8a4319e2fc39-operator-scripts\") pod \"aodh-5228-account-create-update-wvbk8\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.685040 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9pfx\" 
(UniqueName: \"kubernetes.io/projected/115f6f75-7a37-4896-881f-f937b03faf91-kube-api-access-q9pfx\") pod \"aodh-db-create-84nsz\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.685936 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/115f6f75-7a37-4896-881f-f937b03faf91-operator-scripts\") pod \"aodh-db-create-84nsz\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.707994 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9pfx\" (UniqueName: \"kubernetes.io/projected/115f6f75-7a37-4896-881f-f937b03faf91-kube-api-access-q9pfx\") pod \"aodh-db-create-84nsz\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.786751 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtkd\" (UniqueName: \"kubernetes.io/projected/aa72ee03-bea2-443c-babc-8a4319e2fc39-kube-api-access-7jtkd\") pod \"aodh-5228-account-create-update-wvbk8\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.787204 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa72ee03-bea2-443c-babc-8a4319e2fc39-operator-scripts\") pod \"aodh-5228-account-create-update-wvbk8\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.788028 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aa72ee03-bea2-443c-babc-8a4319e2fc39-operator-scripts\") pod \"aodh-5228-account-create-update-wvbk8\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.803417 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtkd\" (UniqueName: \"kubernetes.io/projected/aa72ee03-bea2-443c-babc-8a4319e2fc39-kube-api-access-7jtkd\") pod \"aodh-5228-account-create-update-wvbk8\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.839738 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:46 crc kubenswrapper[4764]: I1204 01:24:46.930692 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.344380 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-84nsz"] Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.493466 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-5228-account-create-update-wvbk8"] Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.711618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-84nsz" event={"ID":"115f6f75-7a37-4896-881f-f937b03faf91","Type":"ContainerStarted","Data":"530c24ba22ae9a5a240e7bb5e4eb292cfccf4d731adb333f7c51a23f6a673ce0"} Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.711916 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-84nsz" event={"ID":"115f6f75-7a37-4896-881f-f937b03faf91","Type":"ContainerStarted","Data":"a57c3eaf318439c9b339146b1c91cc7e26e062831dc4663662050fd3fdc277f5"} Dec 04 01:24:47 crc kubenswrapper[4764]: 
I1204 01:24:47.714576 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5228-account-create-update-wvbk8" event={"ID":"aa72ee03-bea2-443c-babc-8a4319e2fc39","Type":"ContainerStarted","Data":"0bc9bac836e74dd1e5255df725e4876f27eae97676c9890cf6440fb685df72dc"} Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.714597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5228-account-create-update-wvbk8" event={"ID":"aa72ee03-bea2-443c-babc-8a4319e2fc39","Type":"ContainerStarted","Data":"32e37cbbd36ac9992d18ca181be79a9b61d95630d99d2bb1a176cee658a0b42b"} Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.756813 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-5228-account-create-update-wvbk8" podStartSLOduration=1.756786389 podStartE2EDuration="1.756786389s" podCreationTimestamp="2025-12-04 01:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:24:47.751582421 +0000 UTC m=+6223.512906832" watchObservedRunningTime="2025-12-04 01:24:47.756786389 +0000 UTC m=+6223.518110820" Dec 04 01:24:47 crc kubenswrapper[4764]: I1204 01:24:47.759781 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-84nsz" podStartSLOduration=1.759771383 podStartE2EDuration="1.759771383s" podCreationTimestamp="2025-12-04 01:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:24:47.732919322 +0000 UTC m=+6223.494243733" watchObservedRunningTime="2025-12-04 01:24:47.759771383 +0000 UTC m=+6223.521095804" Dec 04 01:24:48 crc kubenswrapper[4764]: I1204 01:24:48.734511 4764 generic.go:334] "Generic (PLEG): container finished" podID="115f6f75-7a37-4896-881f-f937b03faf91" containerID="530c24ba22ae9a5a240e7bb5e4eb292cfccf4d731adb333f7c51a23f6a673ce0" 
exitCode=0 Dec 04 01:24:48 crc kubenswrapper[4764]: I1204 01:24:48.734588 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-84nsz" event={"ID":"115f6f75-7a37-4896-881f-f937b03faf91","Type":"ContainerDied","Data":"530c24ba22ae9a5a240e7bb5e4eb292cfccf4d731adb333f7c51a23f6a673ce0"} Dec 04 01:24:48 crc kubenswrapper[4764]: I1204 01:24:48.738181 4764 generic.go:334] "Generic (PLEG): container finished" podID="aa72ee03-bea2-443c-babc-8a4319e2fc39" containerID="0bc9bac836e74dd1e5255df725e4876f27eae97676c9890cf6440fb685df72dc" exitCode=0 Dec 04 01:24:48 crc kubenswrapper[4764]: I1204 01:24:48.738243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5228-account-create-update-wvbk8" event={"ID":"aa72ee03-bea2-443c-babc-8a4319e2fc39","Type":"ContainerDied","Data":"0bc9bac836e74dd1e5255df725e4876f27eae97676c9890cf6440fb685df72dc"} Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.063758 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6cpft"] Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.075889 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-6cpft"] Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.329918 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.344262 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.466217 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9pfx\" (UniqueName: \"kubernetes.io/projected/115f6f75-7a37-4896-881f-f937b03faf91-kube-api-access-q9pfx\") pod \"115f6f75-7a37-4896-881f-f937b03faf91\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.466287 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jtkd\" (UniqueName: \"kubernetes.io/projected/aa72ee03-bea2-443c-babc-8a4319e2fc39-kube-api-access-7jtkd\") pod \"aa72ee03-bea2-443c-babc-8a4319e2fc39\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.466333 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa72ee03-bea2-443c-babc-8a4319e2fc39-operator-scripts\") pod \"aa72ee03-bea2-443c-babc-8a4319e2fc39\" (UID: \"aa72ee03-bea2-443c-babc-8a4319e2fc39\") " Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.466404 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/115f6f75-7a37-4896-881f-f937b03faf91-operator-scripts\") pod \"115f6f75-7a37-4896-881f-f937b03faf91\" (UID: \"115f6f75-7a37-4896-881f-f937b03faf91\") " Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.467236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa72ee03-bea2-443c-babc-8a4319e2fc39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa72ee03-bea2-443c-babc-8a4319e2fc39" (UID: "aa72ee03-bea2-443c-babc-8a4319e2fc39"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.467405 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/115f6f75-7a37-4896-881f-f937b03faf91-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "115f6f75-7a37-4896-881f-f937b03faf91" (UID: "115f6f75-7a37-4896-881f-f937b03faf91"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.468283 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/115f6f75-7a37-4896-881f-f937b03faf91-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.468316 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa72ee03-bea2-443c-babc-8a4319e2fc39-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.471970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/115f6f75-7a37-4896-881f-f937b03faf91-kube-api-access-q9pfx" (OuterVolumeSpecName: "kube-api-access-q9pfx") pod "115f6f75-7a37-4896-881f-f937b03faf91" (UID: "115f6f75-7a37-4896-881f-f937b03faf91"). InnerVolumeSpecName "kube-api-access-q9pfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.474810 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa72ee03-bea2-443c-babc-8a4319e2fc39-kube-api-access-7jtkd" (OuterVolumeSpecName: "kube-api-access-7jtkd") pod "aa72ee03-bea2-443c-babc-8a4319e2fc39" (UID: "aa72ee03-bea2-443c-babc-8a4319e2fc39"). InnerVolumeSpecName "kube-api-access-7jtkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.560141 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fdc2149-703a-4038-9b06-75f9c5ef5f21" path="/var/lib/kubelet/pods/5fdc2149-703a-4038-9b06-75f9c5ef5f21/volumes" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.569969 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9pfx\" (UniqueName: \"kubernetes.io/projected/115f6f75-7a37-4896-881f-f937b03faf91-kube-api-access-q9pfx\") on node \"crc\" DevicePath \"\"" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.570002 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jtkd\" (UniqueName: \"kubernetes.io/projected/aa72ee03-bea2-443c-babc-8a4319e2fc39-kube-api-access-7jtkd\") on node \"crc\" DevicePath \"\"" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.760315 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5228-account-create-update-wvbk8" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.760305 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5228-account-create-update-wvbk8" event={"ID":"aa72ee03-bea2-443c-babc-8a4319e2fc39","Type":"ContainerDied","Data":"32e37cbbd36ac9992d18ca181be79a9b61d95630d99d2bb1a176cee658a0b42b"} Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.760470 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e37cbbd36ac9992d18ca181be79a9b61d95630d99d2bb1a176cee658a0b42b" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.763250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-84nsz" event={"ID":"115f6f75-7a37-4896-881f-f937b03faf91","Type":"ContainerDied","Data":"a57c3eaf318439c9b339146b1c91cc7e26e062831dc4663662050fd3fdc277f5"} Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.763302 4764 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a57c3eaf318439c9b339146b1c91cc7e26e062831dc4663662050fd3fdc277f5" Dec 04 01:24:50 crc kubenswrapper[4764]: I1204 01:24:50.763322 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-84nsz" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.026201 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-zldpr"] Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.034956 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-zldpr"] Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.952098 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-8s2jj"] Dec 04 01:24:51 crc kubenswrapper[4764]: E1204 01:24:51.953317 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115f6f75-7a37-4896-881f-f937b03faf91" containerName="mariadb-database-create" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.953350 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="115f6f75-7a37-4896-881f-f937b03faf91" containerName="mariadb-database-create" Dec 04 01:24:51 crc kubenswrapper[4764]: E1204 01:24:51.953406 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa72ee03-bea2-443c-babc-8a4319e2fc39" containerName="mariadb-account-create-update" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.953414 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa72ee03-bea2-443c-babc-8a4319e2fc39" containerName="mariadb-account-create-update" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.953609 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="115f6f75-7a37-4896-881f-f937b03faf91" containerName="mariadb-database-create" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.953630 4764 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aa72ee03-bea2-443c-babc-8a4319e2fc39" containerName="mariadb-account-create-update" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.954506 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.963422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8s2jj"] Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.974498 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.974750 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.974969 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2r4rq" Dec 04 01:24:51 crc kubenswrapper[4764]: I1204 01:24:51.975139 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.105091 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-combined-ca-bundle\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.105204 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-config-data\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.105233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jwd7s\" (UniqueName: \"kubernetes.io/projected/1273966b-f3cd-4958-b6e0-9d83312bdec7-kube-api-access-jwd7s\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.105299 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-scripts\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.207649 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwd7s\" (UniqueName: \"kubernetes.io/projected/1273966b-f3cd-4958-b6e0-9d83312bdec7-kube-api-access-jwd7s\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.208035 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-scripts\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.208142 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-combined-ca-bundle\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.208247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-config-data\") pod 
\"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.213006 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-scripts\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.214546 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-config-data\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.218454 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-combined-ca-bundle\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.228577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwd7s\" (UniqueName: \"kubernetes.io/projected/1273966b-f3cd-4958-b6e0-9d83312bdec7-kube-api-access-jwd7s\") pod \"aodh-db-sync-8s2jj\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.286146 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.556698 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf4de2e-ddef-407f-bc65-e40c822a8a93" path="/var/lib/kubelet/pods/adf4de2e-ddef-407f-bc65-e40c822a8a93/volumes" Dec 04 01:24:52 crc kubenswrapper[4764]: W1204 01:24:52.770770 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1273966b_f3cd_4958_b6e0_9d83312bdec7.slice/crio-8f51c11b77648adc9be4176c5f474c591cf13c42a33aa3426fd4591343fed0fa WatchSource:0}: Error finding container 8f51c11b77648adc9be4176c5f474c591cf13c42a33aa3426fd4591343fed0fa: Status 404 returned error can't find the container with id 8f51c11b77648adc9be4176c5f474c591cf13c42a33aa3426fd4591343fed0fa Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.791547 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8s2jj"] Dec 04 01:24:52 crc kubenswrapper[4764]: I1204 01:24:52.795201 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8s2jj" event={"ID":"1273966b-f3cd-4958-b6e0-9d83312bdec7","Type":"ContainerStarted","Data":"8f51c11b77648adc9be4176c5f474c591cf13c42a33aa3426fd4591343fed0fa"} Dec 04 01:24:57 crc kubenswrapper[4764]: I1204 01:24:57.546427 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:24:57 crc kubenswrapper[4764]: E1204 01:24:57.547603 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:24:57 
crc kubenswrapper[4764]: I1204 01:24:57.860486 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8s2jj" event={"ID":"1273966b-f3cd-4958-b6e0-9d83312bdec7","Type":"ContainerStarted","Data":"a750f214d3eb5894856b448c3c71b34aec87d9766ab9bd653437ae1c9c2caeb8"} Dec 04 01:24:57 crc kubenswrapper[4764]: I1204 01:24:57.887443 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-8s2jj" podStartSLOduration=2.532320821 podStartE2EDuration="6.887414518s" podCreationTimestamp="2025-12-04 01:24:51 +0000 UTC" firstStartedPulling="2025-12-04 01:24:52.77302526 +0000 UTC m=+6228.534349661" lastFinishedPulling="2025-12-04 01:24:57.128118927 +0000 UTC m=+6232.889443358" observedRunningTime="2025-12-04 01:24:57.877126215 +0000 UTC m=+6233.638450656" watchObservedRunningTime="2025-12-04 01:24:57.887414518 +0000 UTC m=+6233.648738939" Dec 04 01:24:59 crc kubenswrapper[4764]: I1204 01:24:59.906523 4764 generic.go:334] "Generic (PLEG): container finished" podID="1273966b-f3cd-4958-b6e0-9d83312bdec7" containerID="a750f214d3eb5894856b448c3c71b34aec87d9766ab9bd653437ae1c9c2caeb8" exitCode=0 Dec 04 01:24:59 crc kubenswrapper[4764]: I1204 01:24:59.906632 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8s2jj" event={"ID":"1273966b-f3cd-4958-b6e0-9d83312bdec7","Type":"ContainerDied","Data":"a750f214d3eb5894856b448c3c71b34aec87d9766ab9bd653437ae1c9c2caeb8"} Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.388843 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.533774 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-scripts\") pod \"1273966b-f3cd-4958-b6e0-9d83312bdec7\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.533919 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-config-data\") pod \"1273966b-f3cd-4958-b6e0-9d83312bdec7\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.534044 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwd7s\" (UniqueName: \"kubernetes.io/projected/1273966b-f3cd-4958-b6e0-9d83312bdec7-kube-api-access-jwd7s\") pod \"1273966b-f3cd-4958-b6e0-9d83312bdec7\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.534436 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-combined-ca-bundle\") pod \"1273966b-f3cd-4958-b6e0-9d83312bdec7\" (UID: \"1273966b-f3cd-4958-b6e0-9d83312bdec7\") " Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.542543 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-scripts" (OuterVolumeSpecName: "scripts") pod "1273966b-f3cd-4958-b6e0-9d83312bdec7" (UID: "1273966b-f3cd-4958-b6e0-9d83312bdec7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.543024 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1273966b-f3cd-4958-b6e0-9d83312bdec7-kube-api-access-jwd7s" (OuterVolumeSpecName: "kube-api-access-jwd7s") pod "1273966b-f3cd-4958-b6e0-9d83312bdec7" (UID: "1273966b-f3cd-4958-b6e0-9d83312bdec7"). InnerVolumeSpecName "kube-api-access-jwd7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.565102 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-config-data" (OuterVolumeSpecName: "config-data") pod "1273966b-f3cd-4958-b6e0-9d83312bdec7" (UID: "1273966b-f3cd-4958-b6e0-9d83312bdec7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.570233 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1273966b-f3cd-4958-b6e0-9d83312bdec7" (UID: "1273966b-f3cd-4958-b6e0-9d83312bdec7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.637800 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.637858 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.637878 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1273966b-f3cd-4958-b6e0-9d83312bdec7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.637896 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwd7s\" (UniqueName: \"kubernetes.io/projected/1273966b-f3cd-4958-b6e0-9d83312bdec7-kube-api-access-jwd7s\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.954382 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8s2jj" event={"ID":"1273966b-f3cd-4958-b6e0-9d83312bdec7","Type":"ContainerDied","Data":"8f51c11b77648adc9be4176c5f474c591cf13c42a33aa3426fd4591343fed0fa"} Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.954453 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f51c11b77648adc9be4176c5f474c591cf13c42a33aa3426fd4591343fed0fa" Dec 04 01:25:01 crc kubenswrapper[4764]: I1204 01:25:01.955285 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8s2jj" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.128110 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 04 01:25:02 crc kubenswrapper[4764]: E1204 01:25:02.128627 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1273966b-f3cd-4958-b6e0-9d83312bdec7" containerName="aodh-db-sync" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.128645 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1273966b-f3cd-4958-b6e0-9d83312bdec7" containerName="aodh-db-sync" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.128857 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1273966b-f3cd-4958-b6e0-9d83312bdec7" containerName="aodh-db-sync" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.130691 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.138098 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.138497 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-2r4rq" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.138881 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.142631 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.261014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4hsb\" (UniqueName: \"kubernetes.io/projected/8af18f38-79b2-4deb-b9ff-11aa9eccf479-kube-api-access-r4hsb\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc 
kubenswrapper[4764]: I1204 01:25:02.261060 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.261193 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-config-data\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.261255 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-scripts\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.362560 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-config-data\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.362982 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-scripts\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.363094 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4hsb\" (UniqueName: \"kubernetes.io/projected/8af18f38-79b2-4deb-b9ff-11aa9eccf479-kube-api-access-r4hsb\") pod 
\"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.363124 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.367812 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-config-data\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.370612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-combined-ca-bundle\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.380911 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af18f38-79b2-4deb-b9ff-11aa9eccf479-scripts\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.396026 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4hsb\" (UniqueName: \"kubernetes.io/projected/8af18f38-79b2-4deb-b9ff-11aa9eccf479-kube-api-access-r4hsb\") pod \"aodh-0\" (UID: \"8af18f38-79b2-4deb-b9ff-11aa9eccf479\") " pod="openstack/aodh-0" Dec 04 01:25:02 crc kubenswrapper[4764]: I1204 01:25:02.464083 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:02.999797 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 04 01:25:03 crc kubenswrapper[4764]: W1204 01:25:03.004900 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8af18f38_79b2_4deb_b9ff_11aa9eccf479.slice/crio-40da1e8524bd84c1b22c57aea915d5ae92a433ad958450df4ba2b5ab94f197d8 WatchSource:0}: Error finding container 40da1e8524bd84c1b22c57aea915d5ae92a433ad958450df4ba2b5ab94f197d8: Status 404 returned error can't find the container with id 40da1e8524bd84c1b22c57aea915d5ae92a433ad958450df4ba2b5ab94f197d8 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.586116 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.586650 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-central-agent" containerID="cri-o://ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6" gracePeriod=30 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.586785 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="proxy-httpd" containerID="cri-o://ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2" gracePeriod=30 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.586822 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="sg-core" containerID="cri-o://c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae" gracePeriod=30 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.586856 4764 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-notification-agent" containerID="cri-o://3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd" gracePeriod=30 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.597989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.951188 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.134:3000/\": dial tcp 10.217.1.134:3000: connect: connection refused" Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.972657 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8af18f38-79b2-4deb-b9ff-11aa9eccf479","Type":"ContainerStarted","Data":"8bf8850021d2ce75e81a2174dfad065e048920597f6649455b21680144af0cea"} Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.972700 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8af18f38-79b2-4deb-b9ff-11aa9eccf479","Type":"ContainerStarted","Data":"40da1e8524bd84c1b22c57aea915d5ae92a433ad958450df4ba2b5ab94f197d8"} Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.975762 4764 generic.go:334] "Generic (PLEG): container finished" podID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerID="ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2" exitCode=0 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.975790 4764 generic.go:334] "Generic (PLEG): container finished" podID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerID="c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae" exitCode=2 Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.975811 4764 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerDied","Data":"ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2"} Dec 04 01:25:03 crc kubenswrapper[4764]: I1204 01:25:03.975839 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerDied","Data":"c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae"} Dec 04 01:25:04 crc kubenswrapper[4764]: I1204 01:25:04.997611 4764 generic.go:334] "Generic (PLEG): container finished" podID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerID="ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6" exitCode=0 Dec 04 01:25:04 crc kubenswrapper[4764]: I1204 01:25:04.997988 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerDied","Data":"ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6"} Dec 04 01:25:06 crc kubenswrapper[4764]: I1204 01:25:06.016048 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8af18f38-79b2-4deb-b9ff-11aa9eccf479","Type":"ContainerStarted","Data":"b75b9e365b4f2dd91dd850d74d012f45f5384f9326653001dce985d21d81ee1f"} Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.030664 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8af18f38-79b2-4deb-b9ff-11aa9eccf479","Type":"ContainerStarted","Data":"ebf9fbf37e960d8f39961b9ce5aab9346bd5e32f389e9f1f5fac74eaacdfb92d"} Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.837931 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-scripts\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986473 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-combined-ca-bundle\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986503 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-run-httpd\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986548 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-log-httpd\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgpp2\" (UniqueName: \"kubernetes.io/projected/8096940b-b4f5-4998-9eec-4a74fed6d469-kube-api-access-dgpp2\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986629 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-sg-core-conf-yaml\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.986671 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-config-data\") pod \"8096940b-b4f5-4998-9eec-4a74fed6d469\" (UID: \"8096940b-b4f5-4998-9eec-4a74fed6d469\") " Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.988352 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.989514 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.993830 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-scripts" (OuterVolumeSpecName: "scripts") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:07 crc kubenswrapper[4764]: I1204 01:25:07.994131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8096940b-b4f5-4998-9eec-4a74fed6d469-kube-api-access-dgpp2" (OuterVolumeSpecName: "kube-api-access-dgpp2") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "kube-api-access-dgpp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.017461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.048225 4764 generic.go:334] "Generic (PLEG): container finished" podID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerID="3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd" exitCode=0 Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.048267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerDied","Data":"3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd"} Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.048293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8096940b-b4f5-4998-9eec-4a74fed6d469","Type":"ContainerDied","Data":"9aad5faab4e25e555ba0e7d17c4374f281b415c71e8df3076b56c9c93e7aa513"} Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.048310 4764 scope.go:117] "RemoveContainer" containerID="ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2" Dec 04 
01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.048435 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.088861 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.088894 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.088905 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8096940b-b4f5-4998-9eec-4a74fed6d469-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.088914 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgpp2\" (UniqueName: \"kubernetes.io/projected/8096940b-b4f5-4998-9eec-4a74fed6d469-kube-api-access-dgpp2\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.088940 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.091225 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.126907 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-config-data" (OuterVolumeSpecName: "config-data") pod "8096940b-b4f5-4998-9eec-4a74fed6d469" (UID: "8096940b-b4f5-4998-9eec-4a74fed6d469"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.159226 4764 scope.go:117] "RemoveContainer" containerID="c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.193996 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.194075 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8096940b-b4f5-4998-9eec-4a74fed6d469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.202773 4764 scope.go:117] "RemoveContainer" containerID="3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.230841 4764 scope.go:117] "RemoveContainer" containerID="ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.261237 4764 scope.go:117] "RemoveContainer" containerID="ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.261634 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2\": container 
with ID starting with ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2 not found: ID does not exist" containerID="ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.261706 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2"} err="failed to get container status \"ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2\": rpc error: code = NotFound desc = could not find container \"ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2\": container with ID starting with ae67438c0e58e4d3678624964c49d985a3438e47fe1233bdcd38b2dbace5c0b2 not found: ID does not exist" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.261777 4764 scope.go:117] "RemoveContainer" containerID="c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.262347 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae\": container with ID starting with c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae not found: ID does not exist" containerID="c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.262382 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae"} err="failed to get container status \"c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae\": rpc error: code = NotFound desc = could not find container \"c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae\": container with ID starting with c5cdbc23ac94fa842fa2e863c91d2dacdaa2e569681cee4ffa472620487983ae not 
found: ID does not exist" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.262404 4764 scope.go:117] "RemoveContainer" containerID="3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.262818 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd\": container with ID starting with 3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd not found: ID does not exist" containerID="3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.262864 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd"} err="failed to get container status \"3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd\": rpc error: code = NotFound desc = could not find container \"3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd\": container with ID starting with 3a5849baa7bea13ac5cce0a24ad995bd699b1d3c3a98a6592186b9905384b3fd not found: ID does not exist" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.262895 4764 scope.go:117] "RemoveContainer" containerID="ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.263267 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6\": container with ID starting with ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6 not found: ID does not exist" containerID="ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.263307 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6"} err="failed to get container status \"ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6\": rpc error: code = NotFound desc = could not find container \"ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6\": container with ID starting with ba0cc71f767e3e7cb8060759dec1855fc9c9e5079687764cc150cb8533f04ae6 not found: ID does not exist" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.409217 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.434373 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.448534 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.449007 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-central-agent" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449025 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-central-agent" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.449050 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="sg-core" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449056 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="sg-core" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.449071 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-notification-agent" Dec 04 01:25:08 
crc kubenswrapper[4764]: I1204 01:25:08.449077 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-notification-agent" Dec 04 01:25:08 crc kubenswrapper[4764]: E1204 01:25:08.449091 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="proxy-httpd" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449096 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="proxy-httpd" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449293 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-notification-agent" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449317 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="sg-core" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449323 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="proxy-httpd" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.449335 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" containerName="ceilometer-central-agent" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.451152 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.456308 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.456495 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.461462 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.563096 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8096940b-b4f5-4998-9eec-4a74fed6d469" path="/var/lib/kubelet/pods/8096940b-b4f5-4998-9eec-4a74fed6d469/volumes" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.601950 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.602051 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-log-httpd\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.602096 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-config-data\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.602109 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.602128 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zp89\" (UniqueName: \"kubernetes.io/projected/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-kube-api-access-6zp89\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.602175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-run-httpd\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.602223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-scripts\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704157 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704253 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-log-httpd\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704287 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-config-data\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704325 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zp89\" (UniqueName: \"kubernetes.io/projected/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-kube-api-access-6zp89\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704398 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-run-httpd\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.704466 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-scripts\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 
01:25:08.707273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-log-httpd\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.707497 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-run-httpd\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.710906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-scripts\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.711323 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.711493 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-config-data\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.711618 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " 
pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.731598 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zp89\" (UniqueName: \"kubernetes.io/projected/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-kube-api-access-6zp89\") pod \"ceilometer-0\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " pod="openstack/ceilometer-0" Dec 04 01:25:08 crc kubenswrapper[4764]: I1204 01:25:08.809409 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:09 crc kubenswrapper[4764]: I1204 01:25:09.051575 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pdqmj"] Dec 04 01:25:09 crc kubenswrapper[4764]: I1204 01:25:09.065367 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pdqmj"] Dec 04 01:25:09 crc kubenswrapper[4764]: I1204 01:25:09.067514 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"8af18f38-79b2-4deb-b9ff-11aa9eccf479","Type":"ContainerStarted","Data":"d6eea558af4848923dccf649a638a614e4f824a9222d41a38db0af199d858ed6"} Dec 04 01:25:09 crc kubenswrapper[4764]: I1204 01:25:09.092052 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.872715275 podStartE2EDuration="7.092022885s" podCreationTimestamp="2025-12-04 01:25:02 +0000 UTC" firstStartedPulling="2025-12-04 01:25:03.009126376 +0000 UTC m=+6238.770450787" lastFinishedPulling="2025-12-04 01:25:08.228433986 +0000 UTC m=+6243.989758397" observedRunningTime="2025-12-04 01:25:09.091253066 +0000 UTC m=+6244.852577477" watchObservedRunningTime="2025-12-04 01:25:09.092022885 +0000 UTC m=+6244.853347296" Dec 04 01:25:09 crc kubenswrapper[4764]: I1204 01:25:09.304568 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:09 crc kubenswrapper[4764]: I1204 01:25:09.546735 
4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:25:09 crc kubenswrapper[4764]: E1204 01:25:09.547048 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:25:10 crc kubenswrapper[4764]: I1204 01:25:10.085371 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerStarted","Data":"ee5e227bfdecbc2e04e1736aacac38721fe698bf6d703eb2c8b3733736b87495"} Dec 04 01:25:10 crc kubenswrapper[4764]: I1204 01:25:10.085828 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerStarted","Data":"a81df948180e45c21bc3e952fc3ad76955eaf833b5e173b995cf277baf6f57cb"} Dec 04 01:25:10 crc kubenswrapper[4764]: I1204 01:25:10.558860 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4989134-2aa5-4637-8256-6e5557b657e6" path="/var/lib/kubelet/pods/f4989134-2aa5-4637-8256-6e5557b657e6/volumes" Dec 04 01:25:12 crc kubenswrapper[4764]: I1204 01:25:12.131997 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerStarted","Data":"d6f64d9129f0da2fbaed7bcfc7ae498d93f0ffb3a19d68ad4491ee4bac16d0f7"} Dec 04 01:25:13 crc kubenswrapper[4764]: I1204 01:25:13.158131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerStarted","Data":"2d1ee2d0255ef7fad12415a428ea5ab3d5ab3bc8ef22c5e73e2b933adcc1bd9c"} Dec 04 01:25:13 crc kubenswrapper[4764]: I1204 01:25:13.826431 4764 scope.go:117] "RemoveContainer" containerID="60160ebe62b3473a9a59940969d4c0ca7d19afad430ba5cb9613978c206d5402" Dec 04 01:25:13 crc kubenswrapper[4764]: I1204 01:25:13.860632 4764 scope.go:117] "RemoveContainer" containerID="f58cc76dfb1041d9c807ee6dd726ef98574cde48e0cc21f8e4e7658fab04785d" Dec 04 01:25:13 crc kubenswrapper[4764]: I1204 01:25:13.952829 4764 scope.go:117] "RemoveContainer" containerID="8fafaaf760b2c10aa3408f55dde7cdee94f47ee862e0e7bb54ee73040a297220" Dec 04 01:25:13 crc kubenswrapper[4764]: I1204 01:25:13.971952 4764 scope.go:117] "RemoveContainer" containerID="6efddbb211d9d38a885d9ae01978c9959a810d165236491985b62a83dffd366d" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.020703 4764 scope.go:117] "RemoveContainer" containerID="7474feb2c708dcc2dacb2db55823dbab40d98e321efee67abd49022edbcce363" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.048017 4764 scope.go:117] "RemoveContainer" containerID="5347af05e7926ebac84bc843569b67c8ace22df2bd1085ed302ee5efd82ee24b" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.075512 4764 scope.go:117] "RemoveContainer" containerID="2197dd50300f9109e0937455d333d6ed1bd8d75784bda88be80d7a553356c765" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.114395 4764 scope.go:117] "RemoveContainer" containerID="bbe039d126df8b31efc3b1b4e3b157753e80b5024142710df9c3f1459445e511" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.145647 4764 scope.go:117] "RemoveContainer" containerID="d1641ec5aa57f03d35c514443cdbca3c3414a019b023d7d63f9cf5f82c174245" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.198517 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerStarted","Data":"cf29d6ae6d16c38f90e772a95eaf9cf0f4f4ed6a2c49a23f65c44ef8d4f4dad6"} Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.199989 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.299397 4764 scope.go:117] "RemoveContainer" containerID="705b2350661e0762320380e7d505a3762070a7cf9a53e7ec08c3c3910a996d5a" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.601057 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7890307180000002 podStartE2EDuration="6.601036506s" podCreationTimestamp="2025-12-04 01:25:08 +0000 UTC" firstStartedPulling="2025-12-04 01:25:09.300430185 +0000 UTC m=+6245.061754596" lastFinishedPulling="2025-12-04 01:25:13.112435963 +0000 UTC m=+6248.873760384" observedRunningTime="2025-12-04 01:25:14.248093808 +0000 UTC m=+6250.009418219" watchObservedRunningTime="2025-12-04 01:25:14.601036506 +0000 UTC m=+6250.362360917" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.608940 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-zjp24"] Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.610583 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.620436 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-zjp24"] Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.709683 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-f6b6-account-create-update-ck66f"] Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.711205 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.713333 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.733747 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-f6b6-account-create-update-ck66f"] Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.745698 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89wz\" (UniqueName: \"kubernetes.io/projected/844be069-36ba-49ad-8555-567a2086a7fc-kube-api-access-d89wz\") pod \"manila-db-create-zjp24\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.746236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/844be069-36ba-49ad-8555-567a2086a7fc-operator-scripts\") pod \"manila-db-create-zjp24\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.847587 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v647\" (UniqueName: \"kubernetes.io/projected/17370d13-d0d9-4908-9f90-723405035fdd-kube-api-access-9v647\") pod \"manila-f6b6-account-create-update-ck66f\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.847660 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17370d13-d0d9-4908-9f90-723405035fdd-operator-scripts\") pod \"manila-f6b6-account-create-update-ck66f\" 
(UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.847756 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/844be069-36ba-49ad-8555-567a2086a7fc-operator-scripts\") pod \"manila-db-create-zjp24\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.847857 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d89wz\" (UniqueName: \"kubernetes.io/projected/844be069-36ba-49ad-8555-567a2086a7fc-kube-api-access-d89wz\") pod \"manila-db-create-zjp24\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.848502 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/844be069-36ba-49ad-8555-567a2086a7fc-operator-scripts\") pod \"manila-db-create-zjp24\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.864411 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89wz\" (UniqueName: \"kubernetes.io/projected/844be069-36ba-49ad-8555-567a2086a7fc-kube-api-access-d89wz\") pod \"manila-db-create-zjp24\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.942676 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-zjp24" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.949228 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v647\" (UniqueName: \"kubernetes.io/projected/17370d13-d0d9-4908-9f90-723405035fdd-kube-api-access-9v647\") pod \"manila-f6b6-account-create-update-ck66f\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.949316 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17370d13-d0d9-4908-9f90-723405035fdd-operator-scripts\") pod \"manila-f6b6-account-create-update-ck66f\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.950348 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17370d13-d0d9-4908-9f90-723405035fdd-operator-scripts\") pod \"manila-f6b6-account-create-update-ck66f\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:14 crc kubenswrapper[4764]: I1204 01:25:14.974353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v647\" (UniqueName: \"kubernetes.io/projected/17370d13-d0d9-4908-9f90-723405035fdd-kube-api-access-9v647\") pod \"manila-f6b6-account-create-update-ck66f\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:15 crc kubenswrapper[4764]: I1204 01:25:15.035301 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:15 crc kubenswrapper[4764]: I1204 01:25:15.579382 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-zjp24"] Dec 04 01:25:15 crc kubenswrapper[4764]: W1204 01:25:15.602053 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod844be069_36ba_49ad_8555_567a2086a7fc.slice/crio-378c0714e6d613433e439e90651cf47ea75b350e409cb0bd2367442a02ab766e WatchSource:0}: Error finding container 378c0714e6d613433e439e90651cf47ea75b350e409cb0bd2367442a02ab766e: Status 404 returned error can't find the container with id 378c0714e6d613433e439e90651cf47ea75b350e409cb0bd2367442a02ab766e Dec 04 01:25:15 crc kubenswrapper[4764]: I1204 01:25:15.785300 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-f6b6-account-create-update-ck66f"] Dec 04 01:25:16 crc kubenswrapper[4764]: I1204 01:25:16.270521 4764 generic.go:334] "Generic (PLEG): container finished" podID="844be069-36ba-49ad-8555-567a2086a7fc" containerID="cf0843dcade991be0d86caf8e615dbff354c0e6c2508beb52bd02b99a22b5b0b" exitCode=0 Dec 04 01:25:16 crc kubenswrapper[4764]: I1204 01:25:16.270585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zjp24" event={"ID":"844be069-36ba-49ad-8555-567a2086a7fc","Type":"ContainerDied","Data":"cf0843dcade991be0d86caf8e615dbff354c0e6c2508beb52bd02b99a22b5b0b"} Dec 04 01:25:16 crc kubenswrapper[4764]: I1204 01:25:16.270614 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zjp24" event={"ID":"844be069-36ba-49ad-8555-567a2086a7fc","Type":"ContainerStarted","Data":"378c0714e6d613433e439e90651cf47ea75b350e409cb0bd2367442a02ab766e"} Dec 04 01:25:16 crc kubenswrapper[4764]: I1204 01:25:16.277740 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f6b6-account-create-update-ck66f" 
event={"ID":"17370d13-d0d9-4908-9f90-723405035fdd","Type":"ContainerStarted","Data":"8000c7995b70966676325d1ccf65e810ecfecc4afce584b36044c5651a63b1d6"} Dec 04 01:25:16 crc kubenswrapper[4764]: I1204 01:25:16.277785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f6b6-account-create-update-ck66f" event={"ID":"17370d13-d0d9-4908-9f90-723405035fdd","Type":"ContainerStarted","Data":"ad055e510e44f2c639d173a7808eacd0bb6060b0f9c74b780e0ee2c0f92d55f5"} Dec 04 01:25:16 crc kubenswrapper[4764]: I1204 01:25:16.344645 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-f6b6-account-create-update-ck66f" podStartSLOduration=2.344618307 podStartE2EDuration="2.344618307s" podCreationTimestamp="2025-12-04 01:25:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:25:16.322263197 +0000 UTC m=+6252.083587608" watchObservedRunningTime="2025-12-04 01:25:16.344618307 +0000 UTC m=+6252.105942718" Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.288263 4764 generic.go:334] "Generic (PLEG): container finished" podID="17370d13-d0d9-4908-9f90-723405035fdd" containerID="8000c7995b70966676325d1ccf65e810ecfecc4afce584b36044c5651a63b1d6" exitCode=0 Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.288315 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f6b6-account-create-update-ck66f" event={"ID":"17370d13-d0d9-4908-9f90-723405035fdd","Type":"ContainerDied","Data":"8000c7995b70966676325d1ccf65e810ecfecc4afce584b36044c5651a63b1d6"} Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.758400 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-zjp24" Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.882789 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d89wz\" (UniqueName: \"kubernetes.io/projected/844be069-36ba-49ad-8555-567a2086a7fc-kube-api-access-d89wz\") pod \"844be069-36ba-49ad-8555-567a2086a7fc\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.883052 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/844be069-36ba-49ad-8555-567a2086a7fc-operator-scripts\") pod \"844be069-36ba-49ad-8555-567a2086a7fc\" (UID: \"844be069-36ba-49ad-8555-567a2086a7fc\") " Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.883984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/844be069-36ba-49ad-8555-567a2086a7fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "844be069-36ba-49ad-8555-567a2086a7fc" (UID: "844be069-36ba-49ad-8555-567a2086a7fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.890331 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844be069-36ba-49ad-8555-567a2086a7fc-kube-api-access-d89wz" (OuterVolumeSpecName: "kube-api-access-d89wz") pod "844be069-36ba-49ad-8555-567a2086a7fc" (UID: "844be069-36ba-49ad-8555-567a2086a7fc"). InnerVolumeSpecName "kube-api-access-d89wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.985982 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/844be069-36ba-49ad-8555-567a2086a7fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:17 crc kubenswrapper[4764]: I1204 01:25:17.986025 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d89wz\" (UniqueName: \"kubernetes.io/projected/844be069-36ba-49ad-8555-567a2086a7fc-kube-api-access-d89wz\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.304954 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-zjp24" event={"ID":"844be069-36ba-49ad-8555-567a2086a7fc","Type":"ContainerDied","Data":"378c0714e6d613433e439e90651cf47ea75b350e409cb0bd2367442a02ab766e"} Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.307528 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="378c0714e6d613433e439e90651cf47ea75b350e409cb0bd2367442a02ab766e" Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.305012 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-zjp24" Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.763094 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.913392 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17370d13-d0d9-4908-9f90-723405035fdd-operator-scripts\") pod \"17370d13-d0d9-4908-9f90-723405035fdd\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.913809 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v647\" (UniqueName: \"kubernetes.io/projected/17370d13-d0d9-4908-9f90-723405035fdd-kube-api-access-9v647\") pod \"17370d13-d0d9-4908-9f90-723405035fdd\" (UID: \"17370d13-d0d9-4908-9f90-723405035fdd\") " Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.914007 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17370d13-d0d9-4908-9f90-723405035fdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17370d13-d0d9-4908-9f90-723405035fdd" (UID: "17370d13-d0d9-4908-9f90-723405035fdd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.914564 4764 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17370d13-d0d9-4908-9f90-723405035fdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:18 crc kubenswrapper[4764]: I1204 01:25:18.919094 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17370d13-d0d9-4908-9f90-723405035fdd-kube-api-access-9v647" (OuterVolumeSpecName: "kube-api-access-9v647") pod "17370d13-d0d9-4908-9f90-723405035fdd" (UID: "17370d13-d0d9-4908-9f90-723405035fdd"). InnerVolumeSpecName "kube-api-access-9v647". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:19 crc kubenswrapper[4764]: I1204 01:25:19.016535 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v647\" (UniqueName: \"kubernetes.io/projected/17370d13-d0d9-4908-9f90-723405035fdd-kube-api-access-9v647\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:19 crc kubenswrapper[4764]: I1204 01:25:19.317190 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-f6b6-account-create-update-ck66f" event={"ID":"17370d13-d0d9-4908-9f90-723405035fdd","Type":"ContainerDied","Data":"ad055e510e44f2c639d173a7808eacd0bb6060b0f9c74b780e0ee2c0f92d55f5"} Dec 04 01:25:19 crc kubenswrapper[4764]: I1204 01:25:19.317224 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad055e510e44f2c639d173a7808eacd0bb6060b0f9c74b780e0ee2c0f92d55f5" Dec 04 01:25:19 crc kubenswrapper[4764]: I1204 01:25:19.317236 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-f6b6-account-create-update-ck66f" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.019948 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-k6pbq"] Dec 04 01:25:20 crc kubenswrapper[4764]: E1204 01:25:20.020806 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844be069-36ba-49ad-8555-567a2086a7fc" containerName="mariadb-database-create" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.020830 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="844be069-36ba-49ad-8555-567a2086a7fc" containerName="mariadb-database-create" Dec 04 01:25:20 crc kubenswrapper[4764]: E1204 01:25:20.020851 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17370d13-d0d9-4908-9f90-723405035fdd" containerName="mariadb-account-create-update" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.020860 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="17370d13-d0d9-4908-9f90-723405035fdd" containerName="mariadb-account-create-update" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.021141 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="844be069-36ba-49ad-8555-567a2086a7fc" containerName="mariadb-database-create" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.021182 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="17370d13-d0d9-4908-9f90-723405035fdd" containerName="mariadb-account-create-update" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.022124 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.024684 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-qfktk" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.026101 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.043527 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-k6pbq"] Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.139851 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-job-config-data\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.139959 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-config-data\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc 
kubenswrapper[4764]: I1204 01:25:20.140279 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjll\" (UniqueName: \"kubernetes.io/projected/82e554d3-1f76-450f-a5f5-0a350bddb83b-kube-api-access-4mjll\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.140411 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-combined-ca-bundle\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.243054 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-job-config-data\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.243108 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-config-data\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.243237 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjll\" (UniqueName: \"kubernetes.io/projected/82e554d3-1f76-450f-a5f5-0a350bddb83b-kube-api-access-4mjll\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.243284 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-combined-ca-bundle\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.249457 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-job-config-data\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.251324 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-combined-ca-bundle\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.251873 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-config-data\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.262137 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjll\" (UniqueName: \"kubernetes.io/projected/82e554d3-1f76-450f-a5f5-0a350bddb83b-kube-api-access-4mjll\") pod \"manila-db-sync-k6pbq\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:20 crc kubenswrapper[4764]: I1204 01:25:20.344220 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:21 crc kubenswrapper[4764]: I1204 01:25:21.144444 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-k6pbq"] Dec 04 01:25:21 crc kubenswrapper[4764]: I1204 01:25:21.341934 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k6pbq" event={"ID":"82e554d3-1f76-450f-a5f5-0a350bddb83b","Type":"ContainerStarted","Data":"fd665de367eff782b8eaf1e43b4486c99a1bc99fa0f98f03dcad116fa7772146"} Dec 04 01:25:21 crc kubenswrapper[4764]: I1204 01:25:21.547066 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:25:21 crc kubenswrapper[4764]: E1204 01:25:21.547457 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:25:26 crc kubenswrapper[4764]: I1204 01:25:26.405245 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k6pbq" event={"ID":"82e554d3-1f76-450f-a5f5-0a350bddb83b","Type":"ContainerStarted","Data":"58f50321737b52e89ec250653ba7284f489e2bc53c2b2d3e749c453f03fb7bad"} Dec 04 01:25:26 crc kubenswrapper[4764]: I1204 01:25:26.438158 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-k6pbq" podStartSLOduration=3.168942451 podStartE2EDuration="7.438127702s" podCreationTimestamp="2025-12-04 01:25:19 +0000 UTC" firstStartedPulling="2025-12-04 01:25:21.151441023 +0000 UTC m=+6256.912765434" lastFinishedPulling="2025-12-04 01:25:25.420626274 +0000 UTC m=+6261.181950685" observedRunningTime="2025-12-04 
01:25:26.430225947 +0000 UTC m=+6262.191550388" watchObservedRunningTime="2025-12-04 01:25:26.438127702 +0000 UTC m=+6262.199452143" Dec 04 01:25:28 crc kubenswrapper[4764]: I1204 01:25:28.429406 4764 generic.go:334] "Generic (PLEG): container finished" podID="82e554d3-1f76-450f-a5f5-0a350bddb83b" containerID="58f50321737b52e89ec250653ba7284f489e2bc53c2b2d3e749c453f03fb7bad" exitCode=0 Dec 04 01:25:28 crc kubenswrapper[4764]: I1204 01:25:28.429471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k6pbq" event={"ID":"82e554d3-1f76-450f-a5f5-0a350bddb83b","Type":"ContainerDied","Data":"58f50321737b52e89ec250653ba7284f489e2bc53c2b2d3e749c453f03fb7bad"} Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.128203 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.294465 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-job-config-data\") pod \"82e554d3-1f76-450f-a5f5-0a350bddb83b\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.294710 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-config-data\") pod \"82e554d3-1f76-450f-a5f5-0a350bddb83b\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.294765 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-combined-ca-bundle\") pod \"82e554d3-1f76-450f-a5f5-0a350bddb83b\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.294855 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mjll\" (UniqueName: \"kubernetes.io/projected/82e554d3-1f76-450f-a5f5-0a350bddb83b-kube-api-access-4mjll\") pod \"82e554d3-1f76-450f-a5f5-0a350bddb83b\" (UID: \"82e554d3-1f76-450f-a5f5-0a350bddb83b\") " Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.299922 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "82e554d3-1f76-450f-a5f5-0a350bddb83b" (UID: "82e554d3-1f76-450f-a5f5-0a350bddb83b"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.304827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e554d3-1f76-450f-a5f5-0a350bddb83b-kube-api-access-4mjll" (OuterVolumeSpecName: "kube-api-access-4mjll") pod "82e554d3-1f76-450f-a5f5-0a350bddb83b" (UID: "82e554d3-1f76-450f-a5f5-0a350bddb83b"). InnerVolumeSpecName "kube-api-access-4mjll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.306442 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-config-data" (OuterVolumeSpecName: "config-data") pod "82e554d3-1f76-450f-a5f5-0a350bddb83b" (UID: "82e554d3-1f76-450f-a5f5-0a350bddb83b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.327392 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82e554d3-1f76-450f-a5f5-0a350bddb83b" (UID: "82e554d3-1f76-450f-a5f5-0a350bddb83b"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.398258 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mjll\" (UniqueName: \"kubernetes.io/projected/82e554d3-1f76-450f-a5f5-0a350bddb83b-kube-api-access-4mjll\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.398562 4764 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.398575 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.398590 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e554d3-1f76-450f-a5f5-0a350bddb83b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.451336 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-k6pbq" event={"ID":"82e554d3-1f76-450f-a5f5-0a350bddb83b","Type":"ContainerDied","Data":"fd665de367eff782b8eaf1e43b4486c99a1bc99fa0f98f03dcad116fa7772146"} Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.451378 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd665de367eff782b8eaf1e43b4486c99a1bc99fa0f98f03dcad116fa7772146" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.451444 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-k6pbq" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.844568 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 01:25:30 crc kubenswrapper[4764]: E1204 01:25:30.845163 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e554d3-1f76-450f-a5f5-0a350bddb83b" containerName="manila-db-sync" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.845187 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e554d3-1f76-450f-a5f5-0a350bddb83b" containerName="manila-db-sync" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.845410 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e554d3-1f76-450f-a5f5-0a350bddb83b" containerName="manila-db-sync" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.846732 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.853842 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-qfktk" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.854168 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.854324 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.856696 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.867486 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.869380 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.877655 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.889300 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.907134 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.964986 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57bb5675bf-2g4sg"] Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.967303 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:30 crc kubenswrapper[4764]: I1204 01:25:30.979634 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bb5675bf-2g4sg"] Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038220 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-config-data\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038283 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038307 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/aec8203f-7269-4248-bf06-a696028aba5c-ceph\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038460 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-scripts\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038502 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-scripts\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038561 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aec8203f-7269-4248-bf06-a696028aba5c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038679 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-config-data\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038711 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t44tf\" (UniqueName: 
\"kubernetes.io/projected/aec8203f-7269-4248-bf06-a696028aba5c-kube-api-access-t44tf\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038921 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/aec8203f-7269-4248-bf06-a696028aba5c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.038964 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.039067 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.039094 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.039226 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/dc9bdece-1ce0-4895-aaca-df458215eed1-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.039260 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mvtq\" (UniqueName: \"kubernetes.io/projected/dc9bdece-1ce0-4895-aaca-df458215eed1-kube-api-access-2mvtq\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.080570 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.082610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.085067 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.111651 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.140872 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-scripts\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.140923 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aec8203f-7269-4248-bf06-a696028aba5c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 
01:25:31.140974 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-config-data\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.140998 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t44tf\" (UniqueName: \"kubernetes.io/projected/aec8203f-7269-4248-bf06-a696028aba5c-kube-api-access-t44tf\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141039 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-sb\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141065 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-dns-svc\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141098 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/aec8203f-7269-4248-bf06-a696028aba5c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141115 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141153 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141169 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141186 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-nb\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141227 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc9bdece-1ce0-4895-aaca-df458215eed1-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141245 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-config\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141263 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mvtq\" (UniqueName: \"kubernetes.io/projected/dc9bdece-1ce0-4895-aaca-df458215eed1-kube-api-access-2mvtq\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141283 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-config-data\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141300 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141316 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aec8203f-7269-4248-bf06-a696028aba5c-ceph\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff9rw\" (UniqueName: \"kubernetes.io/projected/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-kube-api-access-ff9rw\") pod 
\"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.141361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-scripts\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.143772 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dc9bdece-1ce0-4895-aaca-df458215eed1-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.144143 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/aec8203f-7269-4248-bf06-a696028aba5c-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.147271 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.148489 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-config-data\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 
01:25:31.148884 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/aec8203f-7269-4248-bf06-a696028aba5c-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.151334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/aec8203f-7269-4248-bf06-a696028aba5c-ceph\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.153094 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-scripts\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.153248 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.153523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-scripts\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.158356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-config-data\") pod \"manila-share-share1-0\" 
(UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.159601 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc9bdece-1ce0-4895-aaca-df458215eed1-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.163677 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aec8203f-7269-4248-bf06-a696028aba5c-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.164056 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t44tf\" (UniqueName: \"kubernetes.io/projected/aec8203f-7269-4248-bf06-a696028aba5c-kube-api-access-t44tf\") pod \"manila-share-share1-0\" (UID: \"aec8203f-7269-4248-bf06-a696028aba5c\") " pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.180262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mvtq\" (UniqueName: \"kubernetes.io/projected/dc9bdece-1ce0-4895-aaca-df458215eed1-kube-api-access-2mvtq\") pod \"manila-scheduler-0\" (UID: \"dc9bdece-1ce0-4895-aaca-df458215eed1\") " pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.202269 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.213861 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243211 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-config-data-custom\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243264 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-dns-svc\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-nb\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243390 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243417 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-config\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc 
kubenswrapper[4764]: I1204 01:25:31.243448 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff9rw\" (UniqueName: \"kubernetes.io/projected/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-kube-api-access-ff9rw\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243466 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-etc-machine-id\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243519 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trv2n\" (UniqueName: \"kubernetes.io/projected/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-kube-api-access-trv2n\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243532 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-scripts\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243556 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-config-data\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243578 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-logs\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.243609 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-sb\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.244471 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-sb\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.244662 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-nb\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.245128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-dns-svc\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.246697 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-config\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.260792 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff9rw\" (UniqueName: \"kubernetes.io/projected/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-kube-api-access-ff9rw\") pod \"dnsmasq-dns-57bb5675bf-2g4sg\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.311266 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346586 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-etc-machine-id\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346707 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trv2n\" (UniqueName: \"kubernetes.io/projected/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-kube-api-access-trv2n\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346755 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-scripts\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346791 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-config-data\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346820 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-logs\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346813 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-etc-machine-id\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.346864 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-config-data-custom\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.347786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.351996 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-logs\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 
01:25:31.352301 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-config-data\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.353462 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-scripts\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.367590 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-config-data-custom\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.369033 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.374359 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trv2n\" (UniqueName: \"kubernetes.io/projected/ec1ff801-5a0d-4a01-9366-1b0355a19ca0-kube-api-access-trv2n\") pod \"manila-api-0\" (UID: \"ec1ff801-5a0d-4a01-9366-1b0355a19ca0\") " pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.424832 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Dec 04 01:25:31 crc kubenswrapper[4764]: I1204 01:25:31.809106 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 04 01:25:31 crc kubenswrapper[4764]: W1204 01:25:31.810211 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc9bdece_1ce0_4895_aaca_df458215eed1.slice/crio-cc4ed3c35a173820b05b2faab9eefab877fb8629106abc95c1bba36a0ac7d98f WatchSource:0}: Error finding container cc4ed3c35a173820b05b2faab9eefab877fb8629106abc95c1bba36a0ac7d98f: Status 404 returned error can't find the container with id cc4ed3c35a173820b05b2faab9eefab877fb8629106abc95c1bba36a0ac7d98f Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.008904 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 04 01:25:32 crc kubenswrapper[4764]: W1204 01:25:32.021015 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaec8203f_7269_4248_bf06_a696028aba5c.slice/crio-58fb6b4a9d9f97f490de24a344ed608c674721545511874abd4f82ff15e16cd5 WatchSource:0}: Error finding container 58fb6b4a9d9f97f490de24a344ed608c674721545511874abd4f82ff15e16cd5: Status 404 returned error can't find the container with id 58fb6b4a9d9f97f490de24a344ed608c674721545511874abd4f82ff15e16cd5 Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.045070 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57bb5675bf-2g4sg"] Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.387973 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.501976 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"aec8203f-7269-4248-bf06-a696028aba5c","Type":"ContainerStarted","Data":"58fb6b4a9d9f97f490de24a344ed608c674721545511874abd4f82ff15e16cd5"} Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.507850 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" event={"ID":"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37","Type":"ContainerStarted","Data":"6093236fd6d150727c0f785cf013d62f793428b8d912f4d966f51a887e9b8dd1"} Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.516907 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"dc9bdece-1ce0-4895-aaca-df458215eed1","Type":"ContainerStarted","Data":"cc4ed3c35a173820b05b2faab9eefab877fb8629106abc95c1bba36a0ac7d98f"} Dec 04 01:25:32 crc kubenswrapper[4764]: I1204 01:25:32.547674 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:25:32 crc kubenswrapper[4764]: E1204 01:25:32.548172 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:25:33 crc kubenswrapper[4764]: I1204 01:25:33.531968 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"dc9bdece-1ce0-4895-aaca-df458215eed1","Type":"ContainerStarted","Data":"8009279756f1de942cf24d734264bb6ed166de324560bf9f78eb780da77d2a93"} Dec 04 01:25:33 crc kubenswrapper[4764]: I1204 01:25:33.536547 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerID="77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5" 
exitCode=0 Dec 04 01:25:33 crc kubenswrapper[4764]: I1204 01:25:33.536630 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" event={"ID":"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37","Type":"ContainerDied","Data":"77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5"} Dec 04 01:25:33 crc kubenswrapper[4764]: I1204 01:25:33.547161 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ec1ff801-5a0d-4a01-9366-1b0355a19ca0","Type":"ContainerStarted","Data":"abb828f0584fd526360dacf6a953df6f9989e5fada2e8c5924740abc138b9d8f"} Dec 04 01:25:33 crc kubenswrapper[4764]: I1204 01:25:33.547288 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"ec1ff801-5a0d-4a01-9366-1b0355a19ca0","Type":"ContainerStarted","Data":"dda8ef8b10c86e6d3c1d52bfcf44f697e7e7de2715f746b99fba50a0ba1df3e2"} Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.565310 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"dc9bdece-1ce0-4895-aaca-df458215eed1","Type":"ContainerStarted","Data":"7a8eed6d86825c629b91b2ebd9b697201b3f927cfe3a2a91e17b2d5a832bd3c3"} Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.567898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" event={"ID":"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37","Type":"ContainerStarted","Data":"6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806"} Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.568785 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.570492 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"ec1ff801-5a0d-4a01-9366-1b0355a19ca0","Type":"ContainerStarted","Data":"e9f9a0fb7669edc7961bc75c7ce9f78841da88d0b4e1c8a376b44e34709cf8f5"} Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.571055 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.657380 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.7293813460000003 podStartE2EDuration="4.65736273s" podCreationTimestamp="2025-12-04 01:25:30 +0000 UTC" firstStartedPulling="2025-12-04 01:25:31.811948355 +0000 UTC m=+6267.573272766" lastFinishedPulling="2025-12-04 01:25:32.739929739 +0000 UTC m=+6268.501254150" observedRunningTime="2025-12-04 01:25:34.649150398 +0000 UTC m=+6270.410474809" watchObservedRunningTime="2025-12-04 01:25:34.65736273 +0000 UTC m=+6270.418687141" Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.677130 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" podStartSLOduration=4.677109146 podStartE2EDuration="4.677109146s" podCreationTimestamp="2025-12-04 01:25:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:25:34.668011322 +0000 UTC m=+6270.429335733" watchObservedRunningTime="2025-12-04 01:25:34.677109146 +0000 UTC m=+6270.438433557" Dec 04 01:25:34 crc kubenswrapper[4764]: I1204 01:25:34.703425 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.703404933 podStartE2EDuration="3.703404933s" podCreationTimestamp="2025-12-04 01:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:25:34.696203496 +0000 UTC m=+6270.457527907" 
watchObservedRunningTime="2025-12-04 01:25:34.703404933 +0000 UTC m=+6270.464729344" Dec 04 01:25:38 crc kubenswrapper[4764]: I1204 01:25:38.923813 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 01:25:40 crc kubenswrapper[4764]: I1204 01:25:40.639785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"aec8203f-7269-4248-bf06-a696028aba5c","Type":"ContainerStarted","Data":"55018dfda052a9e2bb463bd436b7e887d4d8ced353e2f4720a0ec9f561fd8aa1"} Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.202674 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.313774 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.388527 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd4b7497f-nls86"] Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.389009 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" podUID="27f4424a-d562-42ae-aaac-879944c2134b" containerName="dnsmasq-dns" containerID="cri-o://1ce0ec671b8a052db3202bacd52e10f2b29f63289079f6235e6dfd6550497690" gracePeriod=10 Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.654145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" event={"ID":"27f4424a-d562-42ae-aaac-879944c2134b","Type":"ContainerDied","Data":"1ce0ec671b8a052db3202bacd52e10f2b29f63289079f6235e6dfd6550497690"} Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.654088 4764 generic.go:334] "Generic (PLEG): container finished" podID="27f4424a-d562-42ae-aaac-879944c2134b" containerID="1ce0ec671b8a052db3202bacd52e10f2b29f63289079f6235e6dfd6550497690" 
exitCode=0 Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.657672 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"aec8203f-7269-4248-bf06-a696028aba5c","Type":"ContainerStarted","Data":"bfd1c54fb016e2d113d9f4cbe3a570470a0a800a8cb1b9025937919066f508e7"} Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.688037 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.994339649 podStartE2EDuration="11.688018859s" podCreationTimestamp="2025-12-04 01:25:30 +0000 UTC" firstStartedPulling="2025-12-04 01:25:32.030526236 +0000 UTC m=+6267.791850647" lastFinishedPulling="2025-12-04 01:25:39.724205436 +0000 UTC m=+6275.485529857" observedRunningTime="2025-12-04 01:25:41.675763297 +0000 UTC m=+6277.437087708" watchObservedRunningTime="2025-12-04 01:25:41.688018859 +0000 UTC m=+6277.449343270" Dec 04 01:25:41 crc kubenswrapper[4764]: I1204 01:25:41.948512 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.138359 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-sb\") pod \"27f4424a-d562-42ae-aaac-879944c2134b\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.138417 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86trz\" (UniqueName: \"kubernetes.io/projected/27f4424a-d562-42ae-aaac-879944c2134b-kube-api-access-86trz\") pod \"27f4424a-d562-42ae-aaac-879944c2134b\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.138509 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-dns-svc\") pod \"27f4424a-d562-42ae-aaac-879944c2134b\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.139409 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-nb\") pod \"27f4424a-d562-42ae-aaac-879944c2134b\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.139449 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-config\") pod \"27f4424a-d562-42ae-aaac-879944c2134b\" (UID: \"27f4424a-d562-42ae-aaac-879944c2134b\") " Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.145029 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/27f4424a-d562-42ae-aaac-879944c2134b-kube-api-access-86trz" (OuterVolumeSpecName: "kube-api-access-86trz") pod "27f4424a-d562-42ae-aaac-879944c2134b" (UID: "27f4424a-d562-42ae-aaac-879944c2134b"). InnerVolumeSpecName "kube-api-access-86trz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.204856 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27f4424a-d562-42ae-aaac-879944c2134b" (UID: "27f4424a-d562-42ae-aaac-879944c2134b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.205227 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27f4424a-d562-42ae-aaac-879944c2134b" (UID: "27f4424a-d562-42ae-aaac-879944c2134b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.229845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27f4424a-d562-42ae-aaac-879944c2134b" (UID: "27f4424a-d562-42ae-aaac-879944c2134b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.234342 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-config" (OuterVolumeSpecName: "config") pod "27f4424a-d562-42ae-aaac-879944c2134b" (UID: "27f4424a-d562-42ae-aaac-879944c2134b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.244603 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.244637 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86trz\" (UniqueName: \"kubernetes.io/projected/27f4424a-d562-42ae-aaac-879944c2134b-kube-api-access-86trz\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.244653 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.244666 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.244676 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27f4424a-d562-42ae-aaac-879944c2134b-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.671014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" event={"ID":"27f4424a-d562-42ae-aaac-879944c2134b","Type":"ContainerDied","Data":"29eeaa7021392724baeabb350949b3822d6ceee8a2e5c7112cdf4e195110435a"} Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.671069 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd4b7497f-nls86" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.671417 4764 scope.go:117] "RemoveContainer" containerID="1ce0ec671b8a052db3202bacd52e10f2b29f63289079f6235e6dfd6550497690" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.702212 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd4b7497f-nls86"] Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.706612 4764 scope.go:117] "RemoveContainer" containerID="3a918550b44845f01662d4bd4a87f09926fde95ff6eb29fd60f4ac6a9896bd6a" Dec 04 01:25:42 crc kubenswrapper[4764]: I1204 01:25:42.714043 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd4b7497f-nls86"] Dec 04 01:25:43 crc kubenswrapper[4764]: I1204 01:25:43.839598 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:43 crc kubenswrapper[4764]: I1204 01:25:43.839853 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-central-agent" containerID="cri-o://ee5e227bfdecbc2e04e1736aacac38721fe698bf6d703eb2c8b3733736b87495" gracePeriod=30 Dec 04 01:25:43 crc kubenswrapper[4764]: I1204 01:25:43.839958 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="proxy-httpd" containerID="cri-o://cf29d6ae6d16c38f90e772a95eaf9cf0f4f4ed6a2c49a23f65c44ef8d4f4dad6" gracePeriod=30 Dec 04 01:25:43 crc kubenswrapper[4764]: I1204 01:25:43.839993 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="sg-core" containerID="cri-o://2d1ee2d0255ef7fad12415a428ea5ab3d5ab3bc8ef22c5e73e2b933adcc1bd9c" gracePeriod=30 Dec 04 01:25:43 crc kubenswrapper[4764]: I1204 
01:25:43.840023 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-notification-agent" containerID="cri-o://d6f64d9129f0da2fbaed7bcfc7ae498d93f0ffb3a19d68ad4491ee4bac16d0f7" gracePeriod=30 Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.561565 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f4424a-d562-42ae-aaac-879944c2134b" path="/var/lib/kubelet/pods/27f4424a-d562-42ae-aaac-879944c2134b/volumes" Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.695498 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerID="cf29d6ae6d16c38f90e772a95eaf9cf0f4f4ed6a2c49a23f65c44ef8d4f4dad6" exitCode=0 Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.695529 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerID="2d1ee2d0255ef7fad12415a428ea5ab3d5ab3bc8ef22c5e73e2b933adcc1bd9c" exitCode=2 Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.695539 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerID="ee5e227bfdecbc2e04e1736aacac38721fe698bf6d703eb2c8b3733736b87495" exitCode=0 Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.695558 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerDied","Data":"cf29d6ae6d16c38f90e772a95eaf9cf0f4f4ed6a2c49a23f65c44ef8d4f4dad6"} Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.695599 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerDied","Data":"2d1ee2d0255ef7fad12415a428ea5ab3d5ab3bc8ef22c5e73e2b933adcc1bd9c"} Dec 04 01:25:44 crc kubenswrapper[4764]: I1204 01:25:44.695616 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerDied","Data":"ee5e227bfdecbc2e04e1736aacac38721fe698bf6d703eb2c8b3733736b87495"} Dec 04 01:25:46 crc kubenswrapper[4764]: I1204 01:25:46.547375 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:25:46 crc kubenswrapper[4764]: E1204 01:25:46.548831 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.734294 4764 generic.go:334] "Generic (PLEG): container finished" podID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerID="d6f64d9129f0da2fbaed7bcfc7ae498d93f0ffb3a19d68ad4491ee4bac16d0f7" exitCode=0 Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.734381 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerDied","Data":"d6f64d9129f0da2fbaed7bcfc7ae498d93f0ffb3a19d68ad4491ee4bac16d0f7"} Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.734637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5","Type":"ContainerDied","Data":"a81df948180e45c21bc3e952fc3ad76955eaf833b5e173b995cf277baf6f57cb"} Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.734660 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a81df948180e45c21bc3e952fc3ad76955eaf833b5e173b995cf277baf6f57cb" Dec 04 01:25:47 crc 
kubenswrapper[4764]: I1204 01:25:47.766091 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.860466 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-run-httpd\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.860851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.860871 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-log-httpd\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.860957 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-sg-core-conf-yaml\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861049 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zp89\" (UniqueName: \"kubernetes.io/projected/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-kube-api-access-6zp89\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 
04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861080 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-combined-ca-bundle\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861139 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-scripts\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861180 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861226 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-config-data\") pod \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\" (UID: \"bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5\") " Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861849 4764 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.861868 4764 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.867139 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-scripts" (OuterVolumeSpecName: "scripts") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.868440 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-kube-api-access-6zp89" (OuterVolumeSpecName: "kube-api-access-6zp89") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "kube-api-access-6zp89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.904894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.956918 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.963311 4764 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.963346 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zp89\" (UniqueName: \"kubernetes.io/projected/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-kube-api-access-6zp89\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.963356 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.963366 4764 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-scripts\") on 
node \"crc\" DevicePath \"\"" Dec 04 01:25:47 crc kubenswrapper[4764]: I1204 01:25:47.978770 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-config-data" (OuterVolumeSpecName: "config-data") pod "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" (UID: "bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.065224 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.744263 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.772054 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.781248 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.795811 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:48 crc kubenswrapper[4764]: E1204 01:25:48.797380 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="proxy-httpd" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.797435 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="proxy-httpd" Dec 04 01:25:48 crc kubenswrapper[4764]: E1204 01:25:48.797474 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f4424a-d562-42ae-aaac-879944c2134b" containerName="init" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.797487 4764 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="27f4424a-d562-42ae-aaac-879944c2134b" containerName="init" Dec 04 01:25:48 crc kubenswrapper[4764]: E1204 01:25:48.797521 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-central-agent" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.797535 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-central-agent" Dec 04 01:25:48 crc kubenswrapper[4764]: E1204 01:25:48.797569 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="sg-core" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.797581 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="sg-core" Dec 04 01:25:48 crc kubenswrapper[4764]: E1204 01:25:48.797610 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-notification-agent" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.797623 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-notification-agent" Dec 04 01:25:48 crc kubenswrapper[4764]: E1204 01:25:48.797643 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f4424a-d562-42ae-aaac-879944c2134b" containerName="dnsmasq-dns" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.797657 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f4424a-d562-42ae-aaac-879944c2134b" containerName="dnsmasq-dns" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.798100 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="sg-core" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.798157 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-central-agent" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.798192 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="proxy-httpd" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.798209 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" containerName="ceilometer-notification-agent" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.798233 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f4424a-d562-42ae-aaac-879944c2134b" containerName="dnsmasq-dns" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.801739 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.804873 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.804882 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.860081 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.986702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b284z\" (UniqueName: \"kubernetes.io/projected/433a26a8-5ca9-408a-a45c-3ab1326328df-kube-api-access-b284z\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.986778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-config-data\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.986803 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.986819 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/433a26a8-5ca9-408a-a45c-3ab1326328df-run-httpd\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.986839 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.986866 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-scripts\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:48 crc kubenswrapper[4764]: I1204 01:25:48.987435 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/433a26a8-5ca9-408a-a45c-3ab1326328df-log-httpd\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " 
pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089229 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/433a26a8-5ca9-408a-a45c-3ab1326328df-log-httpd\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089543 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b284z\" (UniqueName: \"kubernetes.io/projected/433a26a8-5ca9-408a-a45c-3ab1326328df-kube-api-access-b284z\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089634 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-config-data\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089736 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/433a26a8-5ca9-408a-a45c-3ab1326328df-run-httpd\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089875 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.089958 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-scripts\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.090207 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/433a26a8-5ca9-408a-a45c-3ab1326328df-log-httpd\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.090213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/433a26a8-5ca9-408a-a45c-3ab1326328df-run-httpd\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.097448 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.098709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-scripts\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.104292 4764 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.106085 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/433a26a8-5ca9-408a-a45c-3ab1326328df-config-data\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.115441 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b284z\" (UniqueName: \"kubernetes.io/projected/433a26a8-5ca9-408a-a45c-3ab1326328df-kube-api-access-b284z\") pod \"ceilometer-0\" (UID: \"433a26a8-5ca9-408a-a45c-3ab1326328df\") " pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.174170 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.708125 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 01:25:49 crc kubenswrapper[4764]: I1204 01:25:49.753903 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"433a26a8-5ca9-408a-a45c-3ab1326328df","Type":"ContainerStarted","Data":"731394cb0a992aa12fb21e281854d44774bbaeb6fbb06aab438b26390e023d32"} Dec 04 01:25:50 crc kubenswrapper[4764]: I1204 01:25:50.557345 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5" path="/var/lib/kubelet/pods/bbddfbbe-ea5b-4068-8e0e-aaf8991b95d5/volumes" Dec 04 01:25:50 crc kubenswrapper[4764]: I1204 01:25:50.766896 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"433a26a8-5ca9-408a-a45c-3ab1326328df","Type":"ContainerStarted","Data":"f3db1e1f970f8863e20f55a9e10c1c009e52cb0ff250f36470caabc95497e3e4"} Dec 04 01:25:51 crc kubenswrapper[4764]: I1204 01:25:51.061222 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-bfttz"] Dec 04 01:25:51 crc kubenswrapper[4764]: I1204 01:25:51.073799 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-bfttz"] Dec 04 01:25:51 crc kubenswrapper[4764]: I1204 01:25:51.214631 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 04 01:25:51 crc kubenswrapper[4764]: I1204 01:25:51.792564 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"433a26a8-5ca9-408a-a45c-3ab1326328df","Type":"ContainerStarted","Data":"ced6f53f0692bf51c8b986f23a9e8e77368af456b9451d264ae7c8519fc49828"} Dec 04 01:25:52 crc kubenswrapper[4764]: I1204 01:25:52.046753 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-8f64-account-create-update-phfqx"] Dec 04 01:25:52 crc kubenswrapper[4764]: I1204 01:25:52.065192 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8f64-account-create-update-phfqx"] Dec 04 01:25:52 crc kubenswrapper[4764]: I1204 01:25:52.702221 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d" path="/var/lib/kubelet/pods/0e9ca50e-771a-4f4a-94ef-2a54aeb2a83d/volumes" Dec 04 01:25:52 crc kubenswrapper[4764]: I1204 01:25:52.703782 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f825b0f-a1a7-4fc7-94f2-44e076ada8ac" path="/var/lib/kubelet/pods/3f825b0f-a1a7-4fc7-94f2-44e076ada8ac/volumes" Dec 04 01:25:52 crc kubenswrapper[4764]: I1204 01:25:52.802185 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"433a26a8-5ca9-408a-a45c-3ab1326328df","Type":"ContainerStarted","Data":"c7fb4dcff406f3c49875294ae475623291f7e5ca4698a20a1bfa23c32cc57be9"} Dec 04 01:25:53 crc kubenswrapper[4764]: I1204 01:25:53.129830 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 04 01:25:53 crc kubenswrapper[4764]: I1204 01:25:53.287916 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 04 01:25:53 crc kubenswrapper[4764]: I1204 01:25:53.304872 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 04 01:25:53 crc kubenswrapper[4764]: I1204 01:25:53.818392 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"433a26a8-5ca9-408a-a45c-3ab1326328df","Type":"ContainerStarted","Data":"082af3ff94b2c6e0014f7fa78e959739a4843cc63299a2c2acb51f437aa4d669"} Dec 04 01:25:53 crc kubenswrapper[4764]: I1204 01:25:53.818779 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Dec 04 01:25:53 crc kubenswrapper[4764]: I1204 01:25:53.846981 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2398077020000002 podStartE2EDuration="5.846961117s" podCreationTimestamp="2025-12-04 01:25:48 +0000 UTC" firstStartedPulling="2025-12-04 01:25:49.695421561 +0000 UTC m=+6285.456745972" lastFinishedPulling="2025-12-04 01:25:53.302574976 +0000 UTC m=+6289.063899387" observedRunningTime="2025-12-04 01:25:53.837049583 +0000 UTC m=+6289.598374004" watchObservedRunningTime="2025-12-04 01:25:53.846961117 +0000 UTC m=+6289.608285528" Dec 04 01:26:00 crc kubenswrapper[4764]: I1204 01:26:00.038050 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ds7ng"] Dec 04 01:26:00 crc kubenswrapper[4764]: I1204 01:26:00.052605 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ds7ng"] Dec 04 01:26:00 crc kubenswrapper[4764]: I1204 01:26:00.566971 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c543fb01-7118-455a-a07b-2618a4c0368a" path="/var/lib/kubelet/pods/c543fb01-7118-455a-a07b-2618a4c0368a/volumes" Dec 04 01:26:01 crc kubenswrapper[4764]: I1204 01:26:01.546526 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:26:01 crc kubenswrapper[4764]: E1204 01:26:01.547517 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:26:13 crc kubenswrapper[4764]: I1204 01:26:13.546878 4764 scope.go:117] "RemoveContainer" 
containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:26:13 crc kubenswrapper[4764]: E1204 01:26:13.547868 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:26:14 crc kubenswrapper[4764]: I1204 01:26:14.699835 4764 scope.go:117] "RemoveContainer" containerID="370a8648c8b29a2d57c0cd6b73bf6e54e4feb331d5b86df4b7de1975d039f6a2" Dec 04 01:26:14 crc kubenswrapper[4764]: I1204 01:26:14.727975 4764 scope.go:117] "RemoveContainer" containerID="41ea1f1a60e912880372494e25f2c8e40bee298af1458850dfd793690d496454" Dec 04 01:26:14 crc kubenswrapper[4764]: I1204 01:26:14.794040 4764 scope.go:117] "RemoveContainer" containerID="888c0b683b0853fbb70012ad304a4b9bf88f289c8958101f0b312cccdc96942a" Dec 04 01:26:14 crc kubenswrapper[4764]: I1204 01:26:14.822043 4764 scope.go:117] "RemoveContainer" containerID="c57e6dd1f938f64ce27864009cffb12191dc4c512f1ee10f7f4449a09a5e11ae" Dec 04 01:26:14 crc kubenswrapper[4764]: I1204 01:26:14.873208 4764 scope.go:117] "RemoveContainer" containerID="7b52487e1c43319a44c0b96d4caed9abfc8996deb92b33a530ca4df7e8a61b41" Dec 04 01:26:19 crc kubenswrapper[4764]: I1204 01:26:19.179456 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 01:26:25 crc kubenswrapper[4764]: I1204 01:26:25.545980 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:26:26 crc kubenswrapper[4764]: I1204 01:26:26.184070 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"18353771820ed602c02a4b51e87e1175f18312a8cd5f9f7961cb594ac2bf8318"} Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.066114 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vsrbd"] Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.074881 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.075071 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsrbd"] Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.194144 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-utilities\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.194438 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-catalog-content\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.194594 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlx9w\" (UniqueName: \"kubernetes.io/projected/5073763c-22c9-44a6-8d65-68497c5c4416-kube-api-access-tlx9w\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.296934 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlx9w\" (UniqueName: \"kubernetes.io/projected/5073763c-22c9-44a6-8d65-68497c5c4416-kube-api-access-tlx9w\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.297078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-utilities\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.297108 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-catalog-content\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.297523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-catalog-content\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.297620 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-utilities\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.315336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tlx9w\" (UniqueName: \"kubernetes.io/projected/5073763c-22c9-44a6-8d65-68497c5c4416-kube-api-access-tlx9w\") pod \"redhat-operators-vsrbd\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.397499 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:35 crc kubenswrapper[4764]: I1204 01:26:35.917592 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vsrbd"] Dec 04 01:26:36 crc kubenswrapper[4764]: I1204 01:26:36.292382 4764 generic.go:334] "Generic (PLEG): container finished" podID="5073763c-22c9-44a6-8d65-68497c5c4416" containerID="f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311" exitCode=0 Dec 04 01:26:36 crc kubenswrapper[4764]: I1204 01:26:36.292440 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerDied","Data":"f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311"} Dec 04 01:26:36 crc kubenswrapper[4764]: I1204 01:26:36.292682 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerStarted","Data":"fa4ffd486e1947940ff3d317b17f5ea1b291cb3d910110f189061561ff28cc82"} Dec 04 01:26:36 crc kubenswrapper[4764]: I1204 01:26:36.295100 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:26:37 crc kubenswrapper[4764]: I1204 01:26:37.305499 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerStarted","Data":"3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d"} 
Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.016471 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8676f85d89-wxvxl"] Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.023181 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.025448 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.029572 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8676f85d89-wxvxl"] Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.067155 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-openstack-cell1\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.067233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-nb\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.067265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlv5c\" (UniqueName: \"kubernetes.io/projected/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-kube-api-access-tlv5c\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.067376 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-dns-svc\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.067503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-config\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.067583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-sb\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.169220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-nb\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.169270 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlv5c\" (UniqueName: \"kubernetes.io/projected/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-kube-api-access-tlv5c\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.169362 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-dns-svc\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.169408 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-config\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.169463 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-sb\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.169520 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-openstack-cell1\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.170279 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-config\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.170292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-nb\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.170356 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-openstack-cell1\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.170889 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-dns-svc\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.170922 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-sb\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.192474 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlv5c\" (UniqueName: \"kubernetes.io/projected/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-kube-api-access-tlv5c\") pod \"dnsmasq-dns-8676f85d89-wxvxl\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:38 crc kubenswrapper[4764]: I1204 01:26:38.353701 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:39 crc kubenswrapper[4764]: W1204 01:26:39.035570 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb3cf21a_5e60_48dd_bbbe_efb028ad476a.slice/crio-f07ae7985a7bd85f2b52bef0318dd769bd67f22cb34527c6be3f44572b5d4d25 WatchSource:0}: Error finding container f07ae7985a7bd85f2b52bef0318dd769bd67f22cb34527c6be3f44572b5d4d25: Status 404 returned error can't find the container with id f07ae7985a7bd85f2b52bef0318dd769bd67f22cb34527c6be3f44572b5d4d25 Dec 04 01:26:39 crc kubenswrapper[4764]: I1204 01:26:39.045092 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8676f85d89-wxvxl"] Dec 04 01:26:39 crc kubenswrapper[4764]: I1204 01:26:39.326692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" event={"ID":"bb3cf21a-5e60-48dd-bbbe-efb028ad476a","Type":"ContainerStarted","Data":"f07ae7985a7bd85f2b52bef0318dd769bd67f22cb34527c6be3f44572b5d4d25"} Dec 04 01:26:40 crc kubenswrapper[4764]: I1204 01:26:40.337869 4764 generic.go:334] "Generic (PLEG): container finished" podID="5073763c-22c9-44a6-8d65-68497c5c4416" containerID="3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d" exitCode=0 Dec 04 01:26:40 crc kubenswrapper[4764]: I1204 01:26:40.337978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerDied","Data":"3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d"} Dec 04 01:26:40 crc kubenswrapper[4764]: I1204 01:26:40.341840 4764 generic.go:334] "Generic (PLEG): container finished" podID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerID="15d6739e5c42d314ac6d1c65fe548e4b28462c5f91099b8acce1e4219dd6a9e4" exitCode=0 Dec 04 01:26:40 crc kubenswrapper[4764]: I1204 01:26:40.341882 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" event={"ID":"bb3cf21a-5e60-48dd-bbbe-efb028ad476a","Type":"ContainerDied","Data":"15d6739e5c42d314ac6d1c65fe548e4b28462c5f91099b8acce1e4219dd6a9e4"} Dec 04 01:26:41 crc kubenswrapper[4764]: I1204 01:26:41.352066 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" event={"ID":"bb3cf21a-5e60-48dd-bbbe-efb028ad476a","Type":"ContainerStarted","Data":"ec852aa21f647bbf1714c52d827fc9cfccb6605fd13807cc78b9555f6c453729"} Dec 04 01:26:41 crc kubenswrapper[4764]: I1204 01:26:41.353235 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:41 crc kubenswrapper[4764]: I1204 01:26:41.356791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerStarted","Data":"c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f"} Dec 04 01:26:41 crc kubenswrapper[4764]: I1204 01:26:41.421961 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" podStartSLOduration=4.421946189 podStartE2EDuration="4.421946189s" podCreationTimestamp="2025-12-04 01:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:26:41.408844437 +0000 UTC m=+6337.170168858" watchObservedRunningTime="2025-12-04 01:26:41.421946189 +0000 UTC m=+6337.183270600" Dec 04 01:26:41 crc kubenswrapper[4764]: I1204 01:26:41.470408 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vsrbd" podStartSLOduration=2.039368867 podStartE2EDuration="6.470392982s" podCreationTimestamp="2025-12-04 01:26:35 +0000 UTC" firstStartedPulling="2025-12-04 01:26:36.29488175 +0000 UTC 
m=+6332.056206161" lastFinishedPulling="2025-12-04 01:26:40.725905865 +0000 UTC m=+6336.487230276" observedRunningTime="2025-12-04 01:26:41.468216848 +0000 UTC m=+6337.229541259" watchObservedRunningTime="2025-12-04 01:26:41.470392982 +0000 UTC m=+6337.231717393" Dec 04 01:26:45 crc kubenswrapper[4764]: I1204 01:26:45.397741 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:45 crc kubenswrapper[4764]: I1204 01:26:45.398430 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:26:46 crc kubenswrapper[4764]: I1204 01:26:46.459125 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vsrbd" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="registry-server" probeResult="failure" output=< Dec 04 01:26:46 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 01:26:46 crc kubenswrapper[4764]: > Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.355766 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.432850 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bb5675bf-2g4sg"] Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.433441 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerName="dnsmasq-dns" containerID="cri-o://6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806" gracePeriod=10 Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.620521 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5df9d8c4b7-vnhgm"] Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.625061 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.639643 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df9d8c4b7-vnhgm"] Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.819265 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdmg9\" (UniqueName: \"kubernetes.io/projected/1d7015a3-19f2-4c8c-aba4-add826c42a61-kube-api-access-xdmg9\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.819317 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-dns-svc\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.819425 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-openstack-cell1\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.819523 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-ovsdbserver-sb\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.819546 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-config\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.819581 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-ovsdbserver-nb\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.922452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-ovsdbserver-nb\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.922841 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdmg9\" (UniqueName: \"kubernetes.io/projected/1d7015a3-19f2-4c8c-aba4-add826c42a61-kube-api-access-xdmg9\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.922878 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-dns-svc\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.922980 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-openstack-cell1\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.923078 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-ovsdbserver-sb\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.923105 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-config\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.923271 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-ovsdbserver-nb\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.923816 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-config\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.925548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-dns-svc\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.925577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-ovsdbserver-sb\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.925560 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1d7015a3-19f2-4c8c-aba4-add826c42a61-openstack-cell1\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.940626 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdmg9\" (UniqueName: \"kubernetes.io/projected/1d7015a3-19f2-4c8c-aba4-add826c42a61-kube-api-access-xdmg9\") pod \"dnsmasq-dns-5df9d8c4b7-vnhgm\" (UID: \"1d7015a3-19f2-4c8c-aba4-add826c42a61\") " pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:48 crc kubenswrapper[4764]: I1204 01:26:48.951680 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.079323 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.229944 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-config\") pod \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.230306 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-dns-svc\") pod \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.230356 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-nb\") pod \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.230395 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff9rw\" (UniqueName: \"kubernetes.io/projected/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-kube-api-access-ff9rw\") pod \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.230447 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-sb\") pod \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\" (UID: \"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37\") " Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.237209 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-kube-api-access-ff9rw" (OuterVolumeSpecName: "kube-api-access-ff9rw") pod "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" (UID: "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37"). InnerVolumeSpecName "kube-api-access-ff9rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.285329 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" (UID: "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.298940 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" (UID: "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.303966 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" (UID: "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.322126 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-config" (OuterVolumeSpecName: "config") pod "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" (UID: "f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.332779 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.332812 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.332823 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.332834 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff9rw\" (UniqueName: \"kubernetes.io/projected/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-kube-api-access-ff9rw\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.332843 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.455296 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5df9d8c4b7-vnhgm"] Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.456809 4764 generic.go:334] "Generic (PLEG): container finished" podID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerID="6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806" exitCode=0 Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.456848 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" 
event={"ID":"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37","Type":"ContainerDied","Data":"6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806"} Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.456857 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.456872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57bb5675bf-2g4sg" event={"ID":"f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37","Type":"ContainerDied","Data":"6093236fd6d150727c0f785cf013d62f793428b8d912f4d966f51a887e9b8dd1"} Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.456890 4764 scope.go:117] "RemoveContainer" containerID="6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.494448 4764 scope.go:117] "RemoveContainer" containerID="77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.501272 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57bb5675bf-2g4sg"] Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.513513 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57bb5675bf-2g4sg"] Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.518987 4764 scope.go:117] "RemoveContainer" containerID="6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806" Dec 04 01:26:49 crc kubenswrapper[4764]: E1204 01:26:49.519603 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806\": container with ID starting with 6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806 not found: ID does not exist" containerID="6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806" Dec 04 01:26:49 
crc kubenswrapper[4764]: I1204 01:26:49.519648 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806"} err="failed to get container status \"6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806\": rpc error: code = NotFound desc = could not find container \"6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806\": container with ID starting with 6a68f821b358f8fa8ca3c15480bb41df1f3fd62e8dbd51d1b3bc801180fed806 not found: ID does not exist" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.519675 4764 scope.go:117] "RemoveContainer" containerID="77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5" Dec 04 01:26:49 crc kubenswrapper[4764]: E1204 01:26:49.520057 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5\": container with ID starting with 77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5 not found: ID does not exist" containerID="77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5" Dec 04 01:26:49 crc kubenswrapper[4764]: I1204 01:26:49.520084 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5"} err="failed to get container status \"77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5\": rpc error: code = NotFound desc = could not find container \"77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5\": container with ID starting with 77ba8c450dd41d9f828cc5effe55eef4dc98b7e935d18726eea5875e3ecae6f5 not found: ID does not exist" Dec 04 01:26:50 crc kubenswrapper[4764]: I1204 01:26:50.468226 4764 generic.go:334] "Generic (PLEG): container finished" podID="1d7015a3-19f2-4c8c-aba4-add826c42a61" 
containerID="c9b78ab186501c2d65449ff54c146fe856f5c423d325c99e8038e36667f2897f" exitCode=0 Dec 04 01:26:50 crc kubenswrapper[4764]: I1204 01:26:50.468317 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" event={"ID":"1d7015a3-19f2-4c8c-aba4-add826c42a61","Type":"ContainerDied","Data":"c9b78ab186501c2d65449ff54c146fe856f5c423d325c99e8038e36667f2897f"} Dec 04 01:26:50 crc kubenswrapper[4764]: I1204 01:26:50.468660 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" event={"ID":"1d7015a3-19f2-4c8c-aba4-add826c42a61","Type":"ContainerStarted","Data":"83902986097a8ee0c7c2cac6505b2f33503df230f63eb580eca5461626c7861e"} Dec 04 01:26:50 crc kubenswrapper[4764]: I1204 01:26:50.577614 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" path="/var/lib/kubelet/pods/f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37/volumes" Dec 04 01:26:51 crc kubenswrapper[4764]: I1204 01:26:51.487081 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" event={"ID":"1d7015a3-19f2-4c8c-aba4-add826c42a61","Type":"ContainerStarted","Data":"78639aae75ac75b389a0928fa6f1a750d611913fc719e84e558ae03665aa08af"} Dec 04 01:26:51 crc kubenswrapper[4764]: I1204 01:26:51.487396 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:51 crc kubenswrapper[4764]: I1204 01:26:51.511741 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" podStartSLOduration=3.511708663 podStartE2EDuration="3.511708663s" podCreationTimestamp="2025-12-04 01:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 01:26:51.50753935 +0000 UTC m=+6347.268863771" watchObservedRunningTime="2025-12-04 01:26:51.511708663 
+0000 UTC m=+6347.273033094" Dec 04 01:26:56 crc kubenswrapper[4764]: I1204 01:26:56.454970 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vsrbd" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="registry-server" probeResult="failure" output=< Dec 04 01:26:56 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 01:26:56 crc kubenswrapper[4764]: > Dec 04 01:26:58 crc kubenswrapper[4764]: I1204 01:26:58.954017 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5df9d8c4b7-vnhgm" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.057603 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8676f85d89-wxvxl"] Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.057882 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerName="dnsmasq-dns" containerID="cri-o://ec852aa21f647bbf1714c52d827fc9cfccb6605fd13807cc78b9555f6c453729" gracePeriod=10 Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.592811 4764 generic.go:334] "Generic (PLEG): container finished" podID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerID="ec852aa21f647bbf1714c52d827fc9cfccb6605fd13807cc78b9555f6c453729" exitCode=0 Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.592852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" event={"ID":"bb3cf21a-5e60-48dd-bbbe-efb028ad476a","Type":"ContainerDied","Data":"ec852aa21f647bbf1714c52d827fc9cfccb6605fd13807cc78b9555f6c453729"} Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.593125 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" 
event={"ID":"bb3cf21a-5e60-48dd-bbbe-efb028ad476a","Type":"ContainerDied","Data":"f07ae7985a7bd85f2b52bef0318dd769bd67f22cb34527c6be3f44572b5d4d25"} Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.593138 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f07ae7985a7bd85f2b52bef0318dd769bd67f22cb34527c6be3f44572b5d4d25" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.652915 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.792389 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-nb\") pod \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.792583 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-config\") pod \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.792844 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-sb\") pod \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.792876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlv5c\" (UniqueName: \"kubernetes.io/projected/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-kube-api-access-tlv5c\") pod \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " Dec 04 01:26:59 crc 
kubenswrapper[4764]: I1204 01:26:59.792991 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-dns-svc\") pod \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.793027 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-openstack-cell1\") pod \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\" (UID: \"bb3cf21a-5e60-48dd-bbbe-efb028ad476a\") " Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.804936 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-kube-api-access-tlv5c" (OuterVolumeSpecName: "kube-api-access-tlv5c") pod "bb3cf21a-5e60-48dd-bbbe-efb028ad476a" (UID: "bb3cf21a-5e60-48dd-bbbe-efb028ad476a"). InnerVolumeSpecName "kube-api-access-tlv5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.857306 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "bb3cf21a-5e60-48dd-bbbe-efb028ad476a" (UID: "bb3cf21a-5e60-48dd-bbbe-efb028ad476a"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.860214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb3cf21a-5e60-48dd-bbbe-efb028ad476a" (UID: "bb3cf21a-5e60-48dd-bbbe-efb028ad476a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.860592 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb3cf21a-5e60-48dd-bbbe-efb028ad476a" (UID: "bb3cf21a-5e60-48dd-bbbe-efb028ad476a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.863944 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-config" (OuterVolumeSpecName: "config") pod "bb3cf21a-5e60-48dd-bbbe-efb028ad476a" (UID: "bb3cf21a-5e60-48dd-bbbe-efb028ad476a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.875901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb3cf21a-5e60-48dd-bbbe-efb028ad476a" (UID: "bb3cf21a-5e60-48dd-bbbe-efb028ad476a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.909848 4764 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-config\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.910196 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.910545 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlv5c\" (UniqueName: \"kubernetes.io/projected/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-kube-api-access-tlv5c\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.910643 4764 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.910738 4764 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 04 01:26:59 crc kubenswrapper[4764]: I1204 01:26:59.910823 4764 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb3cf21a-5e60-48dd-bbbe-efb028ad476a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:00 crc kubenswrapper[4764]: I1204 01:27:00.601965 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8676f85d89-wxvxl" Dec 04 01:27:00 crc kubenswrapper[4764]: I1204 01:27:00.630847 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8676f85d89-wxvxl"] Dec 04 01:27:00 crc kubenswrapper[4764]: I1204 01:27:00.644775 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8676f85d89-wxvxl"] Dec 04 01:27:02 crc kubenswrapper[4764]: I1204 01:27:02.559605 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" path="/var/lib/kubelet/pods/bb3cf21a-5e60-48dd-bbbe-efb028ad476a/volumes" Dec 04 01:27:05 crc kubenswrapper[4764]: I1204 01:27:05.482211 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:27:05 crc kubenswrapper[4764]: I1204 01:27:05.553283 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:27:06 crc kubenswrapper[4764]: I1204 01:27:06.249170 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsrbd"] Dec 04 01:27:06 crc kubenswrapper[4764]: I1204 01:27:06.657285 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vsrbd" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="registry-server" containerID="cri-o://c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f" gracePeriod=2 Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.230441 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.382655 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-utilities\") pod \"5073763c-22c9-44a6-8d65-68497c5c4416\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.382851 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-catalog-content\") pod \"5073763c-22c9-44a6-8d65-68497c5c4416\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.382969 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlx9w\" (UniqueName: \"kubernetes.io/projected/5073763c-22c9-44a6-8d65-68497c5c4416-kube-api-access-tlx9w\") pod \"5073763c-22c9-44a6-8d65-68497c5c4416\" (UID: \"5073763c-22c9-44a6-8d65-68497c5c4416\") " Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.383393 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-utilities" (OuterVolumeSpecName: "utilities") pod "5073763c-22c9-44a6-8d65-68497c5c4416" (UID: "5073763c-22c9-44a6-8d65-68497c5c4416"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.383742 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.390405 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5073763c-22c9-44a6-8d65-68497c5c4416-kube-api-access-tlx9w" (OuterVolumeSpecName: "kube-api-access-tlx9w") pod "5073763c-22c9-44a6-8d65-68497c5c4416" (UID: "5073763c-22c9-44a6-8d65-68497c5c4416"). InnerVolumeSpecName "kube-api-access-tlx9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.472555 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5073763c-22c9-44a6-8d65-68497c5c4416" (UID: "5073763c-22c9-44a6-8d65-68497c5c4416"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.486178 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5073763c-22c9-44a6-8d65-68497c5c4416-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.486214 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlx9w\" (UniqueName: \"kubernetes.io/projected/5073763c-22c9-44a6-8d65-68497c5c4416-kube-api-access-tlx9w\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.668858 4764 generic.go:334] "Generic (PLEG): container finished" podID="5073763c-22c9-44a6-8d65-68497c5c4416" containerID="c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f" exitCode=0 Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.668905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerDied","Data":"c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f"} Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.668936 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vsrbd" event={"ID":"5073763c-22c9-44a6-8d65-68497c5c4416","Type":"ContainerDied","Data":"fa4ffd486e1947940ff3d317b17f5ea1b291cb3d910110f189061561ff28cc82"} Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.668955 4764 scope.go:117] "RemoveContainer" containerID="c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.671477 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vsrbd" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.700440 4764 scope.go:117] "RemoveContainer" containerID="3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.730785 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vsrbd"] Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.750992 4764 scope.go:117] "RemoveContainer" containerID="f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.760070 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vsrbd"] Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.794885 4764 scope.go:117] "RemoveContainer" containerID="c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f" Dec 04 01:27:07 crc kubenswrapper[4764]: E1204 01:27:07.795572 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f\": container with ID starting with c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f not found: ID does not exist" containerID="c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.795607 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f"} err="failed to get container status \"c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f\": rpc error: code = NotFound desc = could not find container \"c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f\": container with ID starting with c68b71b02fbcd1c5563f405b39ea3540d216c9051b4b87e35c37588139c47a3f not found: ID does 
not exist" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.795631 4764 scope.go:117] "RemoveContainer" containerID="3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d" Dec 04 01:27:07 crc kubenswrapper[4764]: E1204 01:27:07.795970 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d\": container with ID starting with 3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d not found: ID does not exist" containerID="3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.795998 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d"} err="failed to get container status \"3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d\": rpc error: code = NotFound desc = could not find container \"3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d\": container with ID starting with 3cdbde594cd0c6aa5da2cdbb0a06cc6ca2e3780a3151fb577fc44ac11ef2706d not found: ID does not exist" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.796014 4764 scope.go:117] "RemoveContainer" containerID="f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311" Dec 04 01:27:07 crc kubenswrapper[4764]: E1204 01:27:07.796241 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311\": container with ID starting with f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311 not found: ID does not exist" containerID="f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311" Dec 04 01:27:07 crc kubenswrapper[4764]: I1204 01:27:07.796261 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311"} err="failed to get container status \"f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311\": rpc error: code = NotFound desc = could not find container \"f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311\": container with ID starting with f0fb5607426a903ca3376d140341388b518f8f4c62dfb099e2e622a186a8f311 not found: ID does not exist" Dec 04 01:27:08 crc kubenswrapper[4764]: I1204 01:27:08.566323 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" path="/var/lib/kubelet/pods/5073763c-22c9-44a6-8d65-68497c5c4416/volumes" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.384499 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg"] Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385548 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerName="init" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385572 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerName="init" Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385612 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="extract-utilities" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385624 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="extract-utilities" Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385647 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerName="dnsmasq-dns" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385658 4764 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerName="dnsmasq-dns" Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385683 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="extract-content" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385693 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="extract-content" Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385709 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="registry-server" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385742 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="registry-server" Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385760 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerName="dnsmasq-dns" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385772 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerName="dnsmasq-dns" Dec 04 01:27:09 crc kubenswrapper[4764]: E1204 01:27:09.385806 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerName="init" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.385816 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerName="init" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.386245 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="5073763c-22c9-44a6-8d65-68497c5c4416" containerName="registry-server" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.386274 4764 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f2d7b7a2-e6d0-4662-8fb5-e47958fe6a37" containerName="dnsmasq-dns" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.386308 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb3cf21a-5e60-48dd-bbbe-efb028ad476a" containerName="dnsmasq-dns" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.387638 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.392776 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.393128 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.393333 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.399470 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.428322 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg"] Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.530449 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.530638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.530979 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.531148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88k7\" (UniqueName: \"kubernetes.io/projected/3da912a0-fc02-4542-928b-e77f1fc9367b-kube-api-access-q88k7\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.531304 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.633327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.633419 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88k7\" (UniqueName: \"kubernetes.io/projected/3da912a0-fc02-4542-928b-e77f1fc9367b-kube-api-access-q88k7\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.633485 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.633579 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.633667 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: 
\"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.639667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.639764 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.640113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.644136 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.660320 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88k7\" (UniqueName: \"kubernetes.io/projected/3da912a0-fc02-4542-928b-e77f1fc9367b-kube-api-access-q88k7\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:09 crc kubenswrapper[4764]: I1204 01:27:09.742094 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:10 crc kubenswrapper[4764]: I1204 01:27:10.385652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg"] Dec 04 01:27:10 crc kubenswrapper[4764]: I1204 01:27:10.706271 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" event={"ID":"3da912a0-fc02-4542-928b-e77f1fc9367b","Type":"ContainerStarted","Data":"70345345fecd1cac0dc076149bb5e5bdfdc6a3d0785b59b8691ec13176575c3f"} Dec 04 01:27:19 crc kubenswrapper[4764]: I1204 01:27:19.822822 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" event={"ID":"3da912a0-fc02-4542-928b-e77f1fc9367b","Type":"ContainerStarted","Data":"8584f04634ad0258fcb7786deb7289ce9f1a76b2026f2db1e74579465345cd5e"} Dec 04 01:27:19 crc kubenswrapper[4764]: I1204 01:27:19.845907 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" podStartSLOduration=2.296869531 podStartE2EDuration="10.845886448s" podCreationTimestamp="2025-12-04 01:27:09 +0000 UTC" firstStartedPulling="2025-12-04 01:27:10.39983545 +0000 UTC m=+6366.161159871" lastFinishedPulling="2025-12-04 01:27:18.948852377 +0000 UTC 
m=+6374.710176788" observedRunningTime="2025-12-04 01:27:19.844854883 +0000 UTC m=+6375.606179314" watchObservedRunningTime="2025-12-04 01:27:19.845886448 +0000 UTC m=+6375.607210869" Dec 04 01:27:31 crc kubenswrapper[4764]: I1204 01:27:31.969364 4764 generic.go:334] "Generic (PLEG): container finished" podID="3da912a0-fc02-4542-928b-e77f1fc9367b" containerID="8584f04634ad0258fcb7786deb7289ce9f1a76b2026f2db1e74579465345cd5e" exitCode=0 Dec 04 01:27:31 crc kubenswrapper[4764]: I1204 01:27:31.969426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" event={"ID":"3da912a0-fc02-4542-928b-e77f1fc9367b","Type":"ContainerDied","Data":"8584f04634ad0258fcb7786deb7289ce9f1a76b2026f2db1e74579465345cd5e"} Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.566398 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.619998 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-pre-adoption-validation-combined-ca-bundle\") pod \"3da912a0-fc02-4542-928b-e77f1fc9367b\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.620076 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q88k7\" (UniqueName: \"kubernetes.io/projected/3da912a0-fc02-4542-928b-e77f1fc9367b-kube-api-access-q88k7\") pod \"3da912a0-fc02-4542-928b-e77f1fc9367b\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.620151 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-inventory\") pod \"3da912a0-fc02-4542-928b-e77f1fc9367b\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.620311 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ceph\") pod \"3da912a0-fc02-4542-928b-e77f1fc9367b\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.620390 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key\") pod \"3da912a0-fc02-4542-928b-e77f1fc9367b\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.628522 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da912a0-fc02-4542-928b-e77f1fc9367b-kube-api-access-q88k7" (OuterVolumeSpecName: "kube-api-access-q88k7") pod "3da912a0-fc02-4542-928b-e77f1fc9367b" (UID: "3da912a0-fc02-4542-928b-e77f1fc9367b"). InnerVolumeSpecName "kube-api-access-q88k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.629410 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ceph" (OuterVolumeSpecName: "ceph") pod "3da912a0-fc02-4542-928b-e77f1fc9367b" (UID: "3da912a0-fc02-4542-928b-e77f1fc9367b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.641138 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "3da912a0-fc02-4542-928b-e77f1fc9367b" (UID: "3da912a0-fc02-4542-928b-e77f1fc9367b"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:27:33 crc kubenswrapper[4764]: E1204 01:27:33.656021 4764 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key podName:3da912a0-fc02-4542-928b-e77f1fc9367b nodeName:}" failed. No retries permitted until 2025-12-04 01:27:34.155990752 +0000 UTC m=+6389.917315173 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key") pod "3da912a0-fc02-4542-928b-e77f1fc9367b" (UID: "3da912a0-fc02-4542-928b-e77f1fc9367b") : error deleting /var/lib/kubelet/pods/3da912a0-fc02-4542-928b-e77f1fc9367b/volume-subpaths: remove /var/lib/kubelet/pods/3da912a0-fc02-4542-928b-e77f1fc9367b/volume-subpaths: no such file or directory Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.658341 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-inventory" (OuterVolumeSpecName: "inventory") pod "3da912a0-fc02-4542-928b-e77f1fc9367b" (UID: "3da912a0-fc02-4542-928b-e77f1fc9367b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.723052 4764 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.723089 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q88k7\" (UniqueName: \"kubernetes.io/projected/3da912a0-fc02-4542-928b-e77f1fc9367b-kube-api-access-q88k7\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.723102 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.723115 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.994113 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" event={"ID":"3da912a0-fc02-4542-928b-e77f1fc9367b","Type":"ContainerDied","Data":"70345345fecd1cac0dc076149bb5e5bdfdc6a3d0785b59b8691ec13176575c3f"} Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.994426 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70345345fecd1cac0dc076149bb5e5bdfdc6a3d0785b59b8691ec13176575c3f" Dec 04 01:27:33 crc kubenswrapper[4764]: I1204 01:27:33.994174 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg" Dec 04 01:27:34 crc kubenswrapper[4764]: I1204 01:27:34.233614 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key\") pod \"3da912a0-fc02-4542-928b-e77f1fc9367b\" (UID: \"3da912a0-fc02-4542-928b-e77f1fc9367b\") " Dec 04 01:27:34 crc kubenswrapper[4764]: I1204 01:27:34.237851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3da912a0-fc02-4542-928b-e77f1fc9367b" (UID: "3da912a0-fc02-4542-928b-e77f1fc9367b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:27:34 crc kubenswrapper[4764]: I1204 01:27:34.337219 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3da912a0-fc02-4542-928b-e77f1fc9367b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.382165 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw"] Dec 04 01:27:42 crc kubenswrapper[4764]: E1204 01:27:42.383159 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da912a0-fc02-4542-928b-e77f1fc9367b" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.383174 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da912a0-fc02-4542-928b-e77f1fc9367b" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.383367 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da912a0-fc02-4542-928b-e77f1fc9367b" 
containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.384080 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.386927 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.387216 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.387314 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.393857 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw"] Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.395438 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.453122 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvp5p\" (UniqueName: \"kubernetes.io/projected/4336186f-bad0-463f-8133-1f6d260ab27f-kube-api-access-hvp5p\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.453210 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: 
\"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.453268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.453376 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.453409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.556400 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvp5p\" (UniqueName: \"kubernetes.io/projected/4336186f-bad0-463f-8133-1f6d260ab27f-kube-api-access-hvp5p\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 
01:27:42.556541 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.556631 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.556957 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.557032 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.563421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: 
\"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.564334 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.565155 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.565190 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.573326 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvp5p\" (UniqueName: \"kubernetes.io/projected/4336186f-bad0-463f-8133-1f6d260ab27f-kube-api-access-hvp5p\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:42 crc kubenswrapper[4764]: I1204 01:27:42.709899 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:27:43 crc kubenswrapper[4764]: I1204 01:27:43.295773 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw"] Dec 04 01:27:44 crc kubenswrapper[4764]: I1204 01:27:44.128663 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" event={"ID":"4336186f-bad0-463f-8133-1f6d260ab27f","Type":"ContainerStarted","Data":"0c23565bd50f1ec54c33b19db3f9c24df1bacbf58098fb72015a7acaeee8e7ee"} Dec 04 01:27:44 crc kubenswrapper[4764]: I1204 01:27:44.129035 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" event={"ID":"4336186f-bad0-463f-8133-1f6d260ab27f","Type":"ContainerStarted","Data":"a5f0756e03584d093c773ab9521123641137d2e50106ca02f2448c72a1aac36b"} Dec 04 01:27:44 crc kubenswrapper[4764]: I1204 01:27:44.154531 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" podStartSLOduration=1.664653258 podStartE2EDuration="2.154512307s" podCreationTimestamp="2025-12-04 01:27:42 +0000 UTC" firstStartedPulling="2025-12-04 01:27:43.305274063 +0000 UTC m=+6399.066598484" lastFinishedPulling="2025-12-04 01:27:43.795133082 +0000 UTC m=+6399.556457533" observedRunningTime="2025-12-04 01:27:44.147068824 +0000 UTC m=+6399.908393275" watchObservedRunningTime="2025-12-04 01:27:44.154512307 +0000 UTC m=+6399.915836718" Dec 04 01:28:18 crc kubenswrapper[4764]: I1204 01:28:18.900216 4764 scope.go:117] "RemoveContainer" containerID="98673d74fc079017d323e98e905ee9976be556fd095905e6bb3f96d3b4b6f936" Dec 04 01:28:18 crc kubenswrapper[4764]: I1204 01:28:18.940154 4764 scope.go:117] "RemoveContainer" containerID="a66aac1cca762730569a1974f2c207b4d6a6d993651e0c476085bb48fcf43e8c" Dec 04 01:28:19 crc 
kubenswrapper[4764]: I1204 01:28:19.196846 4764 scope.go:117] "RemoveContainer" containerID="cde867efa9ed1ef62e54ce1b45c11b1742c5a9d91626b5e7f853ee44e29d6bf7" Dec 04 01:28:19 crc kubenswrapper[4764]: I1204 01:28:19.387579 4764 scope.go:117] "RemoveContainer" containerID="dc5810ef8ab98b3a7cfbd3a5acc09babca231c3f501293fe54837901e49b8e4d" Dec 04 01:28:44 crc kubenswrapper[4764]: I1204 01:28:44.042668 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-m2qsb"] Dec 04 01:28:44 crc kubenswrapper[4764]: I1204 01:28:44.055105 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-m2qsb"] Dec 04 01:28:44 crc kubenswrapper[4764]: I1204 01:28:44.570736 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0170b467-741d-4b47-9835-34356313f145" path="/var/lib/kubelet/pods/0170b467-741d-4b47-9835-34356313f145/volumes" Dec 04 01:28:45 crc kubenswrapper[4764]: I1204 01:28:45.036697 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-aa08-account-create-update-p85k9"] Dec 04 01:28:45 crc kubenswrapper[4764]: I1204 01:28:45.052549 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-aa08-account-create-update-p85k9"] Dec 04 01:28:46 crc kubenswrapper[4764]: I1204 01:28:46.568579 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f5ef1bd-61ec-47a9-829f-3214b0907f8e" path="/var/lib/kubelet/pods/3f5ef1bd-61ec-47a9-829f-3214b0907f8e/volumes" Dec 04 01:28:50 crc kubenswrapper[4764]: I1204 01:28:50.043841 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-wt4kx"] Dec 04 01:28:50 crc kubenswrapper[4764]: I1204 01:28:50.053565 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-wt4kx"] Dec 04 01:28:50 crc kubenswrapper[4764]: I1204 01:28:50.558324 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7face56a-a8b7-41d2-86db-75135a9dcaa0" path="/var/lib/kubelet/pods/7face56a-a8b7-41d2-86db-75135a9dcaa0/volumes" Dec 04 01:28:50 crc kubenswrapper[4764]: I1204 01:28:50.869027 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:28:50 crc kubenswrapper[4764]: I1204 01:28:50.869136 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:28:51 crc kubenswrapper[4764]: I1204 01:28:51.041749 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-2042-account-create-update-zv9mp"] Dec 04 01:28:51 crc kubenswrapper[4764]: I1204 01:28:51.054221 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-2042-account-create-update-zv9mp"] Dec 04 01:28:52 crc kubenswrapper[4764]: I1204 01:28:52.565388 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="505e42c6-8666-4acf-adf8-8ec135624e26" path="/var/lib/kubelet/pods/505e42c6-8666-4acf-adf8-8ec135624e26/volumes" Dec 04 01:29:19 crc kubenswrapper[4764]: I1204 01:29:19.463632 4764 scope.go:117] "RemoveContainer" containerID="3129539e767f0a639eabe362500bd0499b3098aaba4b42cdbc80d8ad192059f7" Dec 04 01:29:19 crc kubenswrapper[4764]: I1204 01:29:19.500881 4764 scope.go:117] "RemoveContainer" containerID="a6dd8e8b88abc0a5f9cc045e7bcabb1c793b9c2cb341a9783f70a4be9b5da1dc" Dec 04 01:29:19 crc kubenswrapper[4764]: I1204 01:29:19.563423 4764 scope.go:117] "RemoveContainer" 
containerID="83aa57f3e0388e223c20288b8b047d9643ab6a922bc1470e93173a66d30e2688" Dec 04 01:29:19 crc kubenswrapper[4764]: I1204 01:29:19.613987 4764 scope.go:117] "RemoveContainer" containerID="64defe28e1b50bea444fc2f226168b0d562b2a0cedcf8264d7a8fb01ec5c76bf" Dec 04 01:29:20 crc kubenswrapper[4764]: I1204 01:29:20.868793 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:29:20 crc kubenswrapper[4764]: I1204 01:29:20.869199 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.167822 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s5pxr"] Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.172005 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.181593 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5pxr"] Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.353436 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-catalog-content\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.354194 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffbmp\" (UniqueName: \"kubernetes.io/projected/49eca207-6941-4fde-b773-39d265b55691-kube-api-access-ffbmp\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.354383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-utilities\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.455902 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-utilities\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.456012 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-catalog-content\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.456062 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffbmp\" (UniqueName: \"kubernetes.io/projected/49eca207-6941-4fde-b773-39d265b55691-kube-api-access-ffbmp\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.456377 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-utilities\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.456452 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-catalog-content\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.504044 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffbmp\" (UniqueName: \"kubernetes.io/projected/49eca207-6941-4fde-b773-39d265b55691-kube-api-access-ffbmp\") pod \"community-operators-s5pxr\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:27 crc kubenswrapper[4764]: I1204 01:29:27.803809 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:28 crc kubenswrapper[4764]: I1204 01:29:28.321235 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s5pxr"] Dec 04 01:29:28 crc kubenswrapper[4764]: I1204 01:29:28.468101 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5pxr" event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerStarted","Data":"fdc8c5b4faa01ba9d485a50d720ff9fbb9b4ef222dc2abbf757fda851f969da8"} Dec 04 01:29:29 crc kubenswrapper[4764]: I1204 01:29:29.477755 4764 generic.go:334] "Generic (PLEG): container finished" podID="49eca207-6941-4fde-b773-39d265b55691" containerID="1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c" exitCode=0 Dec 04 01:29:29 crc kubenswrapper[4764]: I1204 01:29:29.477885 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5pxr" event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerDied","Data":"1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c"} Dec 04 01:29:30 crc kubenswrapper[4764]: I1204 01:29:30.489546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5pxr" event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerStarted","Data":"50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5"} Dec 04 01:29:31 crc kubenswrapper[4764]: I1204 01:29:31.503870 4764 generic.go:334] "Generic (PLEG): container finished" podID="49eca207-6941-4fde-b773-39d265b55691" containerID="50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5" exitCode=0 Dec 04 01:29:31 crc kubenswrapper[4764]: I1204 01:29:31.503973 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5pxr" 
event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerDied","Data":"50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5"} Dec 04 01:29:33 crc kubenswrapper[4764]: I1204 01:29:33.040992 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-ld9jx"] Dec 04 01:29:33 crc kubenswrapper[4764]: I1204 01:29:33.051083 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-ld9jx"] Dec 04 01:29:33 crc kubenswrapper[4764]: I1204 01:29:33.525088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5pxr" event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerStarted","Data":"e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63"} Dec 04 01:29:33 crc kubenswrapper[4764]: I1204 01:29:33.551348 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s5pxr" podStartSLOduration=3.748527713 podStartE2EDuration="6.551326968s" podCreationTimestamp="2025-12-04 01:29:27 +0000 UTC" firstStartedPulling="2025-12-04 01:29:29.481301749 +0000 UTC m=+6505.242626170" lastFinishedPulling="2025-12-04 01:29:32.284101004 +0000 UTC m=+6508.045425425" observedRunningTime="2025-12-04 01:29:33.543113676 +0000 UTC m=+6509.304438127" watchObservedRunningTime="2025-12-04 01:29:33.551326968 +0000 UTC m=+6509.312651389" Dec 04 01:29:34 crc kubenswrapper[4764]: I1204 01:29:34.562852 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b221cfbc-e72d-4f2b-80b5-d32c11e2f963" path="/var/lib/kubelet/pods/b221cfbc-e72d-4f2b-80b5-d32c11e2f963/volumes" Dec 04 01:29:37 crc kubenswrapper[4764]: I1204 01:29:37.804073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:37 crc kubenswrapper[4764]: I1204 01:29:37.804624 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:37 crc kubenswrapper[4764]: I1204 01:29:37.875598 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:38 crc kubenswrapper[4764]: I1204 01:29:38.634569 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:38 crc kubenswrapper[4764]: I1204 01:29:38.720816 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5pxr"] Dec 04 01:29:40 crc kubenswrapper[4764]: I1204 01:29:40.595078 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s5pxr" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="registry-server" containerID="cri-o://e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63" gracePeriod=2 Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.049775 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.229368 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-catalog-content\") pod \"49eca207-6941-4fde-b773-39d265b55691\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.229544 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-utilities\") pod \"49eca207-6941-4fde-b773-39d265b55691\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.229678 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffbmp\" (UniqueName: \"kubernetes.io/projected/49eca207-6941-4fde-b773-39d265b55691-kube-api-access-ffbmp\") pod \"49eca207-6941-4fde-b773-39d265b55691\" (UID: \"49eca207-6941-4fde-b773-39d265b55691\") " Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.230981 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-utilities" (OuterVolumeSpecName: "utilities") pod "49eca207-6941-4fde-b773-39d265b55691" (UID: "49eca207-6941-4fde-b773-39d265b55691"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.235461 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49eca207-6941-4fde-b773-39d265b55691-kube-api-access-ffbmp" (OuterVolumeSpecName: "kube-api-access-ffbmp") pod "49eca207-6941-4fde-b773-39d265b55691" (UID: "49eca207-6941-4fde-b773-39d265b55691"). InnerVolumeSpecName "kube-api-access-ffbmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.295694 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49eca207-6941-4fde-b773-39d265b55691" (UID: "49eca207-6941-4fde-b773-39d265b55691"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.333184 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.333252 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffbmp\" (UniqueName: \"kubernetes.io/projected/49eca207-6941-4fde-b773-39d265b55691-kube-api-access-ffbmp\") on node \"crc\" DevicePath \"\"" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.333285 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49eca207-6941-4fde-b773-39d265b55691-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.608094 4764 generic.go:334] "Generic (PLEG): container finished" podID="49eca207-6941-4fde-b773-39d265b55691" containerID="e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63" exitCode=0 Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.608145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s5pxr" event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerDied","Data":"e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63"} Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.608174 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-s5pxr" event={"ID":"49eca207-6941-4fde-b773-39d265b55691","Type":"ContainerDied","Data":"fdc8c5b4faa01ba9d485a50d720ff9fbb9b4ef222dc2abbf757fda851f969da8"} Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.608194 4764 scope.go:117] "RemoveContainer" containerID="e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.608335 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s5pxr" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.643843 4764 scope.go:117] "RemoveContainer" containerID="50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.657560 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s5pxr"] Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.668697 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s5pxr"] Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.669905 4764 scope.go:117] "RemoveContainer" containerID="1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.743053 4764 scope.go:117] "RemoveContainer" containerID="e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63" Dec 04 01:29:41 crc kubenswrapper[4764]: E1204 01:29:41.743507 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63\": container with ID starting with e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63 not found: ID does not exist" containerID="e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 
01:29:41.743540 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63"} err="failed to get container status \"e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63\": rpc error: code = NotFound desc = could not find container \"e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63\": container with ID starting with e728b76ed51f703933edd04f815977eea0c3d686e5da574c9f8f0b27bc10cd63 not found: ID does not exist" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.743593 4764 scope.go:117] "RemoveContainer" containerID="50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5" Dec 04 01:29:41 crc kubenswrapper[4764]: E1204 01:29:41.743905 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5\": container with ID starting with 50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5 not found: ID does not exist" containerID="50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.743955 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5"} err="failed to get container status \"50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5\": rpc error: code = NotFound desc = could not find container \"50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5\": container with ID starting with 50e65d6b2aa9ae1c65877e72d72c09beb360d7907c4b24e66481f2145a3981f5 not found: ID does not exist" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.743985 4764 scope.go:117] "RemoveContainer" containerID="1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c" Dec 04 01:29:41 crc 
kubenswrapper[4764]: E1204 01:29:41.744508 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c\": container with ID starting with 1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c not found: ID does not exist" containerID="1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c" Dec 04 01:29:41 crc kubenswrapper[4764]: I1204 01:29:41.744546 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c"} err="failed to get container status \"1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c\": rpc error: code = NotFound desc = could not find container \"1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c\": container with ID starting with 1d958f6ad1f7fa0f635d21db365d078445dc03848d8b4b0dd53dde324709ef9c not found: ID does not exist" Dec 04 01:29:42 crc kubenswrapper[4764]: I1204 01:29:42.559095 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49eca207-6941-4fde-b773-39d265b55691" path="/var/lib/kubelet/pods/49eca207-6941-4fde-b773-39d265b55691/volumes" Dec 04 01:29:50 crc kubenswrapper[4764]: I1204 01:29:50.868631 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:29:50 crc kubenswrapper[4764]: I1204 01:29:50.869248 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 04 01:29:50 crc kubenswrapper[4764]: I1204 01:29:50.869302 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:29:50 crc kubenswrapper[4764]: I1204 01:29:50.870216 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18353771820ed602c02a4b51e87e1175f18312a8cd5f9f7961cb594ac2bf8318"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:29:50 crc kubenswrapper[4764]: I1204 01:29:50.870282 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://18353771820ed602c02a4b51e87e1175f18312a8cd5f9f7961cb594ac2bf8318" gracePeriod=600 Dec 04 01:29:51 crc kubenswrapper[4764]: I1204 01:29:51.748264 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="18353771820ed602c02a4b51e87e1175f18312a8cd5f9f7961cb594ac2bf8318" exitCode=0 Dec 04 01:29:51 crc kubenswrapper[4764]: I1204 01:29:51.749197 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"18353771820ed602c02a4b51e87e1175f18312a8cd5f9f7961cb594ac2bf8318"} Dec 04 01:29:51 crc kubenswrapper[4764]: I1204 01:29:51.749240 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70"} Dec 04 01:29:51 crc kubenswrapper[4764]: I1204 01:29:51.749261 4764 scope.go:117] "RemoveContainer" containerID="674c7df746267f828b028a7e60cf6f516198ff1f44f6e3ce78f17e7eaebddff3" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.160852 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv"] Dec 04 01:30:00 crc kubenswrapper[4764]: E1204 01:30:00.162010 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="extract-content" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.162027 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="extract-content" Dec 04 01:30:00 crc kubenswrapper[4764]: E1204 01:30:00.162062 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="registry-server" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.162068 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="registry-server" Dec 04 01:30:00 crc kubenswrapper[4764]: E1204 01:30:00.162094 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="extract-utilities" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.162101 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="extract-utilities" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.162340 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="49eca207-6941-4fde-b773-39d265b55691" containerName="registry-server" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.163285 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.167139 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.167182 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.170606 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv"] Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.331687 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8292e040-cca3-4f5c-817d-4c4c9f479e1f-secret-volume\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.332101 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxt7m\" (UniqueName: \"kubernetes.io/projected/8292e040-cca3-4f5c-817d-4c4c9f479e1f-kube-api-access-wxt7m\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.332286 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8292e040-cca3-4f5c-817d-4c4c9f479e1f-config-volume\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.434004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8292e040-cca3-4f5c-817d-4c4c9f479e1f-config-volume\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.434056 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8292e040-cca3-4f5c-817d-4c4c9f479e1f-secret-volume\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.434090 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxt7m\" (UniqueName: \"kubernetes.io/projected/8292e040-cca3-4f5c-817d-4c4c9f479e1f-kube-api-access-wxt7m\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.435384 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8292e040-cca3-4f5c-817d-4c4c9f479e1f-config-volume\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.445451 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/8292e040-cca3-4f5c-817d-4c4c9f479e1f-secret-volume\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.457608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxt7m\" (UniqueName: \"kubernetes.io/projected/8292e040-cca3-4f5c-817d-4c4c9f479e1f-kube-api-access-wxt7m\") pod \"collect-profiles-29413530-6cjzv\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:00 crc kubenswrapper[4764]: I1204 01:30:00.489535 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:01 crc kubenswrapper[4764]: I1204 01:30:01.061639 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv"] Dec 04 01:30:01 crc kubenswrapper[4764]: I1204 01:30:01.868224 4764 generic.go:334] "Generic (PLEG): container finished" podID="8292e040-cca3-4f5c-817d-4c4c9f479e1f" containerID="e294fb91d22777c68fadea25804d715dfc95714ba1460292ac4ebf82ded0e421" exitCode=0 Dec 04 01:30:01 crc kubenswrapper[4764]: I1204 01:30:01.868324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" event={"ID":"8292e040-cca3-4f5c-817d-4c4c9f479e1f","Type":"ContainerDied","Data":"e294fb91d22777c68fadea25804d715dfc95714ba1460292ac4ebf82ded0e421"} Dec 04 01:30:01 crc kubenswrapper[4764]: I1204 01:30:01.868364 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" 
event={"ID":"8292e040-cca3-4f5c-817d-4c4c9f479e1f","Type":"ContainerStarted","Data":"70817675662bcbbe92f050b645f43fc02644dbbfbcd7c973a381687cde2625ec"} Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.345821 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.507641 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8292e040-cca3-4f5c-817d-4c4c9f479e1f-secret-volume\") pod \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.507758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxt7m\" (UniqueName: \"kubernetes.io/projected/8292e040-cca3-4f5c-817d-4c4c9f479e1f-kube-api-access-wxt7m\") pod \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.508100 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8292e040-cca3-4f5c-817d-4c4c9f479e1f-config-volume\") pod \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\" (UID: \"8292e040-cca3-4f5c-817d-4c4c9f479e1f\") " Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.508827 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8292e040-cca3-4f5c-817d-4c4c9f479e1f-config-volume" (OuterVolumeSpecName: "config-volume") pod "8292e040-cca3-4f5c-817d-4c4c9f479e1f" (UID: "8292e040-cca3-4f5c-817d-4c4c9f479e1f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.513093 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8292e040-cca3-4f5c-817d-4c4c9f479e1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8292e040-cca3-4f5c-817d-4c4c9f479e1f" (UID: "8292e040-cca3-4f5c-817d-4c4c9f479e1f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.522589 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8292e040-cca3-4f5c-817d-4c4c9f479e1f-kube-api-access-wxt7m" (OuterVolumeSpecName: "kube-api-access-wxt7m") pod "8292e040-cca3-4f5c-817d-4c4c9f479e1f" (UID: "8292e040-cca3-4f5c-817d-4c4c9f479e1f"). InnerVolumeSpecName "kube-api-access-wxt7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.612376 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8292e040-cca3-4f5c-817d-4c4c9f479e1f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.612416 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxt7m\" (UniqueName: \"kubernetes.io/projected/8292e040-cca3-4f5c-817d-4c4c9f479e1f-kube-api-access-wxt7m\") on node \"crc\" DevicePath \"\"" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.612427 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8292e040-cca3-4f5c-817d-4c4c9f479e1f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.900021 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" 
event={"ID":"8292e040-cca3-4f5c-817d-4c4c9f479e1f","Type":"ContainerDied","Data":"70817675662bcbbe92f050b645f43fc02644dbbfbcd7c973a381687cde2625ec"} Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.900456 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70817675662bcbbe92f050b645f43fc02644dbbfbcd7c973a381687cde2625ec" Dec 04 01:30:03 crc kubenswrapper[4764]: I1204 01:30:03.900266 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv" Dec 04 01:30:04 crc kubenswrapper[4764]: I1204 01:30:04.439345 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7"] Dec 04 01:30:04 crc kubenswrapper[4764]: I1204 01:30:04.454188 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413485-fvkm7"] Dec 04 01:30:04 crc kubenswrapper[4764]: I1204 01:30:04.560802 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54cc2674-47e4-4b68-b220-1a68ada9eba8" path="/var/lib/kubelet/pods/54cc2674-47e4-4b68-b220-1a68ada9eba8/volumes" Dec 04 01:30:19 crc kubenswrapper[4764]: I1204 01:30:19.791323 4764 scope.go:117] "RemoveContainer" containerID="1f17cd099233467c2a8b412f29eb98c130b798b5362256d3af61562971672cef" Dec 04 01:30:19 crc kubenswrapper[4764]: I1204 01:30:19.859224 4764 scope.go:117] "RemoveContainer" containerID="3abb55e315453663e15ef4b9a85d5f9d785a4a06bef7407ecae0db2141dce5df" Dec 04 01:30:19 crc kubenswrapper[4764]: I1204 01:30:19.948400 4764 scope.go:117] "RemoveContainer" containerID="a7a23246d03c143486966bb236268e1f1dc091660e349e2e3494d069d1434159" Dec 04 01:31:20 crc kubenswrapper[4764]: I1204 01:31:20.114909 4764 scope.go:117] "RemoveContainer" containerID="2d1ee2d0255ef7fad12415a428ea5ab3d5ab3bc8ef22c5e73e2b933adcc1bd9c" Dec 04 01:31:20 crc kubenswrapper[4764]: I1204 
01:31:20.136798 4764 scope.go:117] "RemoveContainer" containerID="ee5e227bfdecbc2e04e1736aacac38721fe698bf6d703eb2c8b3733736b87495" Dec 04 01:31:20 crc kubenswrapper[4764]: I1204 01:31:20.166256 4764 scope.go:117] "RemoveContainer" containerID="d6f64d9129f0da2fbaed7bcfc7ae498d93f0ffb3a19d68ad4491ee4bac16d0f7" Dec 04 01:31:20 crc kubenswrapper[4764]: I1204 01:31:20.235753 4764 scope.go:117] "RemoveContainer" containerID="cf29d6ae6d16c38f90e772a95eaf9cf0f4f4ed6a2c49a23f65c44ef8d4f4dad6" Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.101395 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-mzhfh"] Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.142821 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-mzhfh"] Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.157268 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-b5cc-account-create-update-4nx8x"] Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.183916 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-b5cc-account-create-update-4nx8x"] Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.561389 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="322bc7d6-4e25-4535-81ba-59aff3f7331a" path="/var/lib/kubelet/pods/322bc7d6-4e25-4535-81ba-59aff3f7331a/volumes" Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.562921 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7e7fb83-18af-40c9-907f-284ed3a95843" path="/var/lib/kubelet/pods/b7e7fb83-18af-40c9-907f-284ed3a95843/volumes" Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.868492 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 04 01:32:20 crc kubenswrapper[4764]: I1204 01:32:20.868551 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.867616 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ms7h5"] Dec 04 01:32:29 crc kubenswrapper[4764]: E1204 01:32:29.868733 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8292e040-cca3-4f5c-817d-4c4c9f479e1f" containerName="collect-profiles" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.868751 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="8292e040-cca3-4f5c-817d-4c4c9f479e1f" containerName="collect-profiles" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.869582 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="8292e040-cca3-4f5c-817d-4c4c9f479e1f" containerName="collect-profiles" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.871135 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.899461 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ms7h5"] Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.984976 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-utilities\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.985075 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-catalog-content\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:29 crc kubenswrapper[4764]: I1204 01:32:29.985209 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxtj\" (UniqueName: \"kubernetes.io/projected/f655d039-114b-45f4-955a-ce98d0fadf0c-kube-api-access-msxtj\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.087304 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxtj\" (UniqueName: \"kubernetes.io/projected/f655d039-114b-45f4-955a-ce98d0fadf0c-kube-api-access-msxtj\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.087806 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-utilities\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.087856 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-catalog-content\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.088292 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-catalog-content\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.088300 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-utilities\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.116615 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxtj\" (UniqueName: \"kubernetes.io/projected/f655d039-114b-45f4-955a-ce98d0fadf0c-kube-api-access-msxtj\") pod \"redhat-marketplace-ms7h5\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.194781 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.667437 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ms7h5"] Dec 04 01:32:30 crc kubenswrapper[4764]: W1204 01:32:30.679352 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf655d039_114b_45f4_955a_ce98d0fadf0c.slice/crio-827a9f1a4c01b4ef2866ae5eb6beec3857d3556fd52182d849bab66e5c750b37 WatchSource:0}: Error finding container 827a9f1a4c01b4ef2866ae5eb6beec3857d3556fd52182d849bab66e5c750b37: Status 404 returned error can't find the container with id 827a9f1a4c01b4ef2866ae5eb6beec3857d3556fd52182d849bab66e5c750b37 Dec 04 01:32:30 crc kubenswrapper[4764]: I1204 01:32:30.740204 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerStarted","Data":"827a9f1a4c01b4ef2866ae5eb6beec3857d3556fd52182d849bab66e5c750b37"} Dec 04 01:32:31 crc kubenswrapper[4764]: I1204 01:32:31.757092 4764 generic.go:334] "Generic (PLEG): container finished" podID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerID="107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7" exitCode=0 Dec 04 01:32:31 crc kubenswrapper[4764]: I1204 01:32:31.757173 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerDied","Data":"107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7"} Dec 04 01:32:31 crc kubenswrapper[4764]: I1204 01:32:31.760385 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:32:32 crc kubenswrapper[4764]: I1204 01:32:32.044617 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-db-sync-c2wc5"] Dec 04 01:32:32 crc kubenswrapper[4764]: I1204 01:32:32.056312 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-c2wc5"] Dec 04 01:32:32 crc kubenswrapper[4764]: I1204 01:32:32.561892 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d619f27d-9dc1-4cbf-8fab-8085f9521299" path="/var/lib/kubelet/pods/d619f27d-9dc1-4cbf-8fab-8085f9521299/volumes" Dec 04 01:32:32 crc kubenswrapper[4764]: I1204 01:32:32.769565 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerStarted","Data":"7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7"} Dec 04 01:32:33 crc kubenswrapper[4764]: I1204 01:32:33.787580 4764 generic.go:334] "Generic (PLEG): container finished" podID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerID="7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7" exitCode=0 Dec 04 01:32:33 crc kubenswrapper[4764]: I1204 01:32:33.787655 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerDied","Data":"7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7"} Dec 04 01:32:34 crc kubenswrapper[4764]: I1204 01:32:34.801267 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerStarted","Data":"8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995"} Dec 04 01:32:34 crc kubenswrapper[4764]: I1204 01:32:34.836334 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ms7h5" podStartSLOduration=3.427996819 podStartE2EDuration="5.836317108s" podCreationTimestamp="2025-12-04 01:32:29 +0000 UTC" 
firstStartedPulling="2025-12-04 01:32:31.759819322 +0000 UTC m=+6687.521143763" lastFinishedPulling="2025-12-04 01:32:34.168139631 +0000 UTC m=+6689.929464052" observedRunningTime="2025-12-04 01:32:34.831026448 +0000 UTC m=+6690.592350859" watchObservedRunningTime="2025-12-04 01:32:34.836317108 +0000 UTC m=+6690.597641519" Dec 04 01:32:40 crc kubenswrapper[4764]: I1204 01:32:40.195049 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:40 crc kubenswrapper[4764]: I1204 01:32:40.196691 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:40 crc kubenswrapper[4764]: I1204 01:32:40.262260 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:40 crc kubenswrapper[4764]: I1204 01:32:40.962795 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:41 crc kubenswrapper[4764]: I1204 01:32:41.029535 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ms7h5"] Dec 04 01:32:42 crc kubenswrapper[4764]: I1204 01:32:42.908033 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ms7h5" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="registry-server" containerID="cri-o://8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995" gracePeriod=2 Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.478642 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.611205 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxtj\" (UniqueName: \"kubernetes.io/projected/f655d039-114b-45f4-955a-ce98d0fadf0c-kube-api-access-msxtj\") pod \"f655d039-114b-45f4-955a-ce98d0fadf0c\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.611452 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-catalog-content\") pod \"f655d039-114b-45f4-955a-ce98d0fadf0c\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.611542 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-utilities\") pod \"f655d039-114b-45f4-955a-ce98d0fadf0c\" (UID: \"f655d039-114b-45f4-955a-ce98d0fadf0c\") " Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.612527 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-utilities" (OuterVolumeSpecName: "utilities") pod "f655d039-114b-45f4-955a-ce98d0fadf0c" (UID: "f655d039-114b-45f4-955a-ce98d0fadf0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.617174 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f655d039-114b-45f4-955a-ce98d0fadf0c-kube-api-access-msxtj" (OuterVolumeSpecName: "kube-api-access-msxtj") pod "f655d039-114b-45f4-955a-ce98d0fadf0c" (UID: "f655d039-114b-45f4-955a-ce98d0fadf0c"). InnerVolumeSpecName "kube-api-access-msxtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.640554 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f655d039-114b-45f4-955a-ce98d0fadf0c" (UID: "f655d039-114b-45f4-955a-ce98d0fadf0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.716765 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.716801 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxtj\" (UniqueName: \"kubernetes.io/projected/f655d039-114b-45f4-955a-ce98d0fadf0c-kube-api-access-msxtj\") on node \"crc\" DevicePath \"\"" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.716814 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f655d039-114b-45f4-955a-ce98d0fadf0c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.938169 4764 generic.go:334] "Generic (PLEG): container finished" podID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerID="8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995" exitCode=0 Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.938226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerDied","Data":"8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995"} Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.938245 4764 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ms7h5" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.938264 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ms7h5" event={"ID":"f655d039-114b-45f4-955a-ce98d0fadf0c","Type":"ContainerDied","Data":"827a9f1a4c01b4ef2866ae5eb6beec3857d3556fd52182d849bab66e5c750b37"} Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.938324 4764 scope.go:117] "RemoveContainer" containerID="8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.969232 4764 scope.go:117] "RemoveContainer" containerID="7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7" Dec 04 01:32:43 crc kubenswrapper[4764]: I1204 01:32:43.996311 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ms7h5"] Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.009768 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ms7h5"] Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.051900 4764 scope.go:117] "RemoveContainer" containerID="107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.096380 4764 scope.go:117] "RemoveContainer" containerID="8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995" Dec 04 01:32:44 crc kubenswrapper[4764]: E1204 01:32:44.097101 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995\": container with ID starting with 8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995 not found: ID does not exist" containerID="8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.097150 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995"} err="failed to get container status \"8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995\": rpc error: code = NotFound desc = could not find container \"8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995\": container with ID starting with 8c95dec4357fee311574e8ae92bf38e3acf712c12c0ff612e0c4343a7cf24995 not found: ID does not exist" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.097176 4764 scope.go:117] "RemoveContainer" containerID="7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7" Dec 04 01:32:44 crc kubenswrapper[4764]: E1204 01:32:44.097628 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7\": container with ID starting with 7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7 not found: ID does not exist" containerID="7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.097677 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7"} err="failed to get container status \"7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7\": rpc error: code = NotFound desc = could not find container \"7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7\": container with ID starting with 7441e76e0e42f9ea79ab71cc08294e48bd075344f2e595d5b008dbaecd6871b7 not found: ID does not exist" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.097709 4764 scope.go:117] "RemoveContainer" containerID="107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7" Dec 04 01:32:44 crc kubenswrapper[4764]: E1204 
01:32:44.098079 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7\": container with ID starting with 107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7 not found: ID does not exist" containerID="107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.098111 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7"} err="failed to get container status \"107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7\": rpc error: code = NotFound desc = could not find container \"107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7\": container with ID starting with 107a5453150d85253827710df53fb4de522352fe5629a9ef90616c1541383de7 not found: ID does not exist" Dec 04 01:32:44 crc kubenswrapper[4764]: I1204 01:32:44.563864 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" path="/var/lib/kubelet/pods/f655d039-114b-45f4-955a-ce98d0fadf0c/volumes" Dec 04 01:32:50 crc kubenswrapper[4764]: I1204 01:32:50.869090 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:32:50 crc kubenswrapper[4764]: I1204 01:32:50.869841 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.308803 4764 scope.go:117] "RemoveContainer" containerID="15d6739e5c42d314ac6d1c65fe548e4b28462c5f91099b8acce1e4219dd6a9e4" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.346867 4764 scope.go:117] "RemoveContainer" containerID="44951a39c42c818dfc0a6fefd145c77a8e4abdf5b94f28e4385ce10df458e861" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.417773 4764 scope.go:117] "RemoveContainer" containerID="b8452a0ea41802f202e27b2709c0c0c312581e4db07df2cf52cab1d18b11bba5" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.450367 4764 scope.go:117] "RemoveContainer" containerID="bb436f9fbb0ed12578ee66f971d6b39b3612e70cdff71fd02e1d05acd38bd47b" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.537787 4764 scope.go:117] "RemoveContainer" containerID="ec852aa21f647bbf1714c52d827fc9cfccb6605fd13807cc78b9555f6c453729" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.868278 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.868554 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.868595 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.869357 4764 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:33:20 crc kubenswrapper[4764]: I1204 01:33:20.869409 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" gracePeriod=600 Dec 04 01:33:21 crc kubenswrapper[4764]: E1204 01:33:21.012433 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:33:21 crc kubenswrapper[4764]: I1204 01:33:21.445514 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" exitCode=0 Dec 04 01:33:21 crc kubenswrapper[4764]: I1204 01:33:21.445577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70"} Dec 04 01:33:21 crc kubenswrapper[4764]: I1204 01:33:21.445620 4764 scope.go:117] "RemoveContainer" containerID="18353771820ed602c02a4b51e87e1175f18312a8cd5f9f7961cb594ac2bf8318" Dec 04 01:33:21 crc 
kubenswrapper[4764]: I1204 01:33:21.446797 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:33:21 crc kubenswrapper[4764]: E1204 01:33:21.447360 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:33:32 crc kubenswrapper[4764]: I1204 01:33:32.545899 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:33:32 crc kubenswrapper[4764]: E1204 01:33:32.546871 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:33:46 crc kubenswrapper[4764]: I1204 01:33:46.547474 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:33:46 crc kubenswrapper[4764]: E1204 01:33:46.550326 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 
04 01:34:00 crc kubenswrapper[4764]: I1204 01:34:00.545701 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:34:00 crc kubenswrapper[4764]: E1204 01:34:00.546968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:34:15 crc kubenswrapper[4764]: I1204 01:34:15.546541 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:34:15 crc kubenswrapper[4764]: E1204 01:34:15.547947 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:34:30 crc kubenswrapper[4764]: I1204 01:34:30.545539 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:34:30 crc kubenswrapper[4764]: E1204 01:34:30.546435 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:34:44 crc kubenswrapper[4764]: I1204 01:34:44.546534 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:34:44 crc kubenswrapper[4764]: E1204 01:34:44.548311 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:34:50 crc kubenswrapper[4764]: I1204 01:34:50.078092 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-5228-account-create-update-wvbk8"] Dec 04 01:34:50 crc kubenswrapper[4764]: I1204 01:34:50.092608 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-84nsz"] Dec 04 01:34:50 crc kubenswrapper[4764]: I1204 01:34:50.105643 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-5228-account-create-update-wvbk8"] Dec 04 01:34:50 crc kubenswrapper[4764]: I1204 01:34:50.118173 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-84nsz"] Dec 04 01:34:50 crc kubenswrapper[4764]: I1204 01:34:50.561908 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="115f6f75-7a37-4896-881f-f937b03faf91" path="/var/lib/kubelet/pods/115f6f75-7a37-4896-881f-f937b03faf91/volumes" Dec 04 01:34:50 crc kubenswrapper[4764]: I1204 01:34:50.563128 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa72ee03-bea2-443c-babc-8a4319e2fc39" path="/var/lib/kubelet/pods/aa72ee03-bea2-443c-babc-8a4319e2fc39/volumes" Dec 04 01:34:57 crc kubenswrapper[4764]: I1204 01:34:57.552405 4764 scope.go:117] "RemoveContainer" 
containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:34:57 crc kubenswrapper[4764]: E1204 01:34:57.553185 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:35:01 crc kubenswrapper[4764]: I1204 01:35:01.043319 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-8s2jj"] Dec 04 01:35:01 crc kubenswrapper[4764]: I1204 01:35:01.055519 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-8s2jj"] Dec 04 01:35:02 crc kubenswrapper[4764]: I1204 01:35:02.558475 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1273966b-f3cd-4958-b6e0-9d83312bdec7" path="/var/lib/kubelet/pods/1273966b-f3cd-4958-b6e0-9d83312bdec7/volumes" Dec 04 01:35:11 crc kubenswrapper[4764]: I1204 01:35:11.546266 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:35:11 crc kubenswrapper[4764]: E1204 01:35:11.547138 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:35:18 crc kubenswrapper[4764]: I1204 01:35:18.062157 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-zjp24"] Dec 04 01:35:18 crc 
kubenswrapper[4764]: I1204 01:35:18.086357 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-f6b6-account-create-update-ck66f"] Dec 04 01:35:18 crc kubenswrapper[4764]: I1204 01:35:18.105652 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-zjp24"] Dec 04 01:35:18 crc kubenswrapper[4764]: I1204 01:35:18.117416 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-f6b6-account-create-update-ck66f"] Dec 04 01:35:18 crc kubenswrapper[4764]: I1204 01:35:18.569254 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17370d13-d0d9-4908-9f90-723405035fdd" path="/var/lib/kubelet/pods/17370d13-d0d9-4908-9f90-723405035fdd/volumes" Dec 04 01:35:18 crc kubenswrapper[4764]: I1204 01:35:18.570688 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="844be069-36ba-49ad-8555-567a2086a7fc" path="/var/lib/kubelet/pods/844be069-36ba-49ad-8555-567a2086a7fc/volumes" Dec 04 01:35:20 crc kubenswrapper[4764]: I1204 01:35:20.708236 4764 scope.go:117] "RemoveContainer" containerID="8000c7995b70966676325d1ccf65e810ecfecc4afce584b36044c5651a63b1d6" Dec 04 01:35:20 crc kubenswrapper[4764]: I1204 01:35:20.744955 4764 scope.go:117] "RemoveContainer" containerID="530c24ba22ae9a5a240e7bb5e4eb292cfccf4d731adb333f7c51a23f6a673ce0" Dec 04 01:35:20 crc kubenswrapper[4764]: I1204 01:35:20.831593 4764 scope.go:117] "RemoveContainer" containerID="cf0843dcade991be0d86caf8e615dbff354c0e6c2508beb52bd02b99a22b5b0b" Dec 04 01:35:20 crc kubenswrapper[4764]: I1204 01:35:20.914965 4764 scope.go:117] "RemoveContainer" containerID="0bc9bac836e74dd1e5255df725e4876f27eae97676c9890cf6440fb685df72dc" Dec 04 01:35:20 crc kubenswrapper[4764]: I1204 01:35:20.970997 4764 scope.go:117] "RemoveContainer" containerID="a750f214d3eb5894856b448c3c71b34aec87d9766ab9bd653437ae1c9c2caeb8" Dec 04 01:35:23 crc kubenswrapper[4764]: I1204 01:35:23.547040 4764 scope.go:117] "RemoveContainer" 
containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:35:23 crc kubenswrapper[4764]: E1204 01:35:23.547689 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:35:30 crc kubenswrapper[4764]: I1204 01:35:30.047428 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-k6pbq"] Dec 04 01:35:30 crc kubenswrapper[4764]: I1204 01:35:30.061609 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-k6pbq"] Dec 04 01:35:30 crc kubenswrapper[4764]: I1204 01:35:30.561919 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e554d3-1f76-450f-a5f5-0a350bddb83b" path="/var/lib/kubelet/pods/82e554d3-1f76-450f-a5f5-0a350bddb83b/volumes" Dec 04 01:35:37 crc kubenswrapper[4764]: I1204 01:35:37.545680 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:35:37 crc kubenswrapper[4764]: E1204 01:35:37.546460 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:35:50 crc kubenswrapper[4764]: I1204 01:35:50.546125 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 
01:35:50 crc kubenswrapper[4764]: E1204 01:35:50.546861 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:36:05 crc kubenswrapper[4764]: I1204 01:36:05.545771 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:36:05 crc kubenswrapper[4764]: E1204 01:36:05.546665 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:36:17 crc kubenswrapper[4764]: I1204 01:36:17.546136 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:36:17 crc kubenswrapper[4764]: E1204 01:36:17.546862 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:36:21 crc kubenswrapper[4764]: I1204 01:36:21.119634 4764 scope.go:117] "RemoveContainer" 
containerID="58f50321737b52e89ec250653ba7284f489e2bc53c2b2d3e749c453f03fb7bad" Dec 04 01:36:32 crc kubenswrapper[4764]: I1204 01:36:32.546020 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:36:32 crc kubenswrapper[4764]: E1204 01:36:32.546881 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:36:47 crc kubenswrapper[4764]: I1204 01:36:47.546298 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:36:47 crc kubenswrapper[4764]: E1204 01:36:47.547295 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:37:01 crc kubenswrapper[4764]: I1204 01:37:01.545996 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:37:01 crc kubenswrapper[4764]: E1204 01:37:01.546821 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:37:16 crc kubenswrapper[4764]: I1204 01:37:16.547221 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:37:16 crc kubenswrapper[4764]: E1204 01:37:16.548533 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.429844 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ccgjw"] Dec 04 01:37:20 crc kubenswrapper[4764]: E1204 01:37:20.431210 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="extract-utilities" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.431234 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="extract-utilities" Dec 04 01:37:20 crc kubenswrapper[4764]: E1204 01:37:20.431283 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="extract-content" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.431296 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="extract-content" Dec 04 01:37:20 crc kubenswrapper[4764]: E1204 01:37:20.431313 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="registry-server" Dec 04 01:37:20 crc kubenswrapper[4764]: 
I1204 01:37:20.431324 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="registry-server" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.431742 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f655d039-114b-45f4-955a-ce98d0fadf0c" containerName="registry-server" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.434208 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.445100 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccgjw"] Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.565475 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8hdj\" (UniqueName: \"kubernetes.io/projected/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-kube-api-access-g8hdj\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.565778 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-catalog-content\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.565800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-utilities\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 
01:37:20.607117 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-spws6"] Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.615770 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.624652 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-spws6"] Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.669467 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8hdj\" (UniqueName: \"kubernetes.io/projected/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-kube-api-access-g8hdj\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.669526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-catalog-content\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.669554 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-utilities\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.669603 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-utilities\") pod \"certified-operators-spws6\" (UID: 
\"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.669670 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnknw\" (UniqueName: \"kubernetes.io/projected/072ec496-edad-47a6-bab8-9bbf5145f86f-kube-api-access-gnknw\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.669708 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-catalog-content\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.670273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-utilities\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.670307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-catalog-content\") pod \"redhat-operators-ccgjw\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.693612 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8hdj\" (UniqueName: \"kubernetes.io/projected/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-kube-api-access-g8hdj\") pod \"redhat-operators-ccgjw\" (UID: 
\"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.770183 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.771046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-utilities\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.771132 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnknw\" (UniqueName: \"kubernetes.io/projected/072ec496-edad-47a6-bab8-9bbf5145f86f-kube-api-access-gnknw\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.771172 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-catalog-content\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.771645 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-utilities\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.771708 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-catalog-content\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.805423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnknw\" (UniqueName: \"kubernetes.io/projected/072ec496-edad-47a6-bab8-9bbf5145f86f-kube-api-access-gnknw\") pod \"certified-operators-spws6\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:20 crc kubenswrapper[4764]: I1204 01:37:20.938400 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:21 crc kubenswrapper[4764]: I1204 01:37:21.255681 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ccgjw"] Dec 04 01:37:21 crc kubenswrapper[4764]: I1204 01:37:21.471884 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-spws6"] Dec 04 01:37:21 crc kubenswrapper[4764]: W1204 01:37:21.473870 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod072ec496_edad_47a6_bab8_9bbf5145f86f.slice/crio-22f1220be40aef3da9644910968fc02b68fda893959e44c11c2c04c7508519f1 WatchSource:0}: Error finding container 22f1220be40aef3da9644910968fc02b68fda893959e44c11c2c04c7508519f1: Status 404 returned error can't find the container with id 22f1220be40aef3da9644910968fc02b68fda893959e44c11c2c04c7508519f1 Dec 04 01:37:21 crc kubenswrapper[4764]: I1204 01:37:21.473972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" 
event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerStarted","Data":"4ad303dc7dff36ea55f74725023d3221fb5fdcd2da673fc5ac4e3e64c4c8305f"} Dec 04 01:37:22 crc kubenswrapper[4764]: I1204 01:37:22.489606 4764 generic.go:334] "Generic (PLEG): container finished" podID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerID="95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140" exitCode=0 Dec 04 01:37:22 crc kubenswrapper[4764]: I1204 01:37:22.489788 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerDied","Data":"95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140"} Dec 04 01:37:22 crc kubenswrapper[4764]: I1204 01:37:22.490094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerStarted","Data":"22f1220be40aef3da9644910968fc02b68fda893959e44c11c2c04c7508519f1"} Dec 04 01:37:22 crc kubenswrapper[4764]: I1204 01:37:22.491764 4764 generic.go:334] "Generic (PLEG): container finished" podID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerID="73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6" exitCode=0 Dec 04 01:37:22 crc kubenswrapper[4764]: I1204 01:37:22.491809 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerDied","Data":"73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6"} Dec 04 01:37:23 crc kubenswrapper[4764]: I1204 01:37:23.505063 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerStarted","Data":"e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9"} Dec 04 01:37:23 crc kubenswrapper[4764]: I1204 
01:37:23.507334 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerStarted","Data":"c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b"} Dec 04 01:37:25 crc kubenswrapper[4764]: I1204 01:37:25.538922 4764 generic.go:334] "Generic (PLEG): container finished" podID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerID="c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b" exitCode=0 Dec 04 01:37:25 crc kubenswrapper[4764]: I1204 01:37:25.539037 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerDied","Data":"c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b"} Dec 04 01:37:27 crc kubenswrapper[4764]: I1204 01:37:27.547151 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:37:27 crc kubenswrapper[4764]: E1204 01:37:27.548036 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:37:27 crc kubenswrapper[4764]: I1204 01:37:27.565540 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerStarted","Data":"a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740"} Dec 04 01:37:27 crc kubenswrapper[4764]: I1204 01:37:27.567741 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerID="e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9" exitCode=0 Dec 04 01:37:27 crc kubenswrapper[4764]: I1204 01:37:27.567782 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerDied","Data":"e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9"} Dec 04 01:37:27 crc kubenswrapper[4764]: I1204 01:37:27.589401 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-spws6" podStartSLOduration=3.592700223 podStartE2EDuration="7.589381698s" podCreationTimestamp="2025-12-04 01:37:20 +0000 UTC" firstStartedPulling="2025-12-04 01:37:22.492161795 +0000 UTC m=+6978.253486206" lastFinishedPulling="2025-12-04 01:37:26.48884323 +0000 UTC m=+6982.250167681" observedRunningTime="2025-12-04 01:37:27.585865362 +0000 UTC m=+6983.347189763" watchObservedRunningTime="2025-12-04 01:37:27.589381698 +0000 UTC m=+6983.350706109" Dec 04 01:37:28 crc kubenswrapper[4764]: I1204 01:37:28.578889 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerStarted","Data":"3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653"} Dec 04 01:37:28 crc kubenswrapper[4764]: I1204 01:37:28.599404 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ccgjw" podStartSLOduration=3.132674831 podStartE2EDuration="8.599389329s" podCreationTimestamp="2025-12-04 01:37:20 +0000 UTC" firstStartedPulling="2025-12-04 01:37:22.499043245 +0000 UTC m=+6978.260367656" lastFinishedPulling="2025-12-04 01:37:27.965757743 +0000 UTC m=+6983.727082154" observedRunningTime="2025-12-04 01:37:28.595664017 +0000 UTC m=+6984.356988428" watchObservedRunningTime="2025-12-04 
01:37:28.599389329 +0000 UTC m=+6984.360713740" Dec 04 01:37:30 crc kubenswrapper[4764]: I1204 01:37:30.771166 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:30 crc kubenswrapper[4764]: I1204 01:37:30.771580 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:30 crc kubenswrapper[4764]: I1204 01:37:30.939141 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:30 crc kubenswrapper[4764]: I1204 01:37:30.939673 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:31 crc kubenswrapper[4764]: I1204 01:37:31.004743 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:31 crc kubenswrapper[4764]: I1204 01:37:31.673298 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:31 crc kubenswrapper[4764]: I1204 01:37:31.835831 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ccgjw" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="registry-server" probeResult="failure" output=< Dec 04 01:37:31 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 01:37:31 crc kubenswrapper[4764]: > Dec 04 01:37:32 crc kubenswrapper[4764]: I1204 01:37:32.195572 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-spws6"] Dec 04 01:37:33 crc kubenswrapper[4764]: I1204 01:37:33.629199 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-spws6" 
podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="registry-server" containerID="cri-o://a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740" gracePeriod=2 Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.206699 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.368313 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-utilities\") pod \"072ec496-edad-47a6-bab8-9bbf5145f86f\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.368684 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-catalog-content\") pod \"072ec496-edad-47a6-bab8-9bbf5145f86f\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.368813 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnknw\" (UniqueName: \"kubernetes.io/projected/072ec496-edad-47a6-bab8-9bbf5145f86f-kube-api-access-gnknw\") pod \"072ec496-edad-47a6-bab8-9bbf5145f86f\" (UID: \"072ec496-edad-47a6-bab8-9bbf5145f86f\") " Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.369557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-utilities" (OuterVolumeSpecName: "utilities") pod "072ec496-edad-47a6-bab8-9bbf5145f86f" (UID: "072ec496-edad-47a6-bab8-9bbf5145f86f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.380528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072ec496-edad-47a6-bab8-9bbf5145f86f-kube-api-access-gnknw" (OuterVolumeSpecName: "kube-api-access-gnknw") pod "072ec496-edad-47a6-bab8-9bbf5145f86f" (UID: "072ec496-edad-47a6-bab8-9bbf5145f86f"). InnerVolumeSpecName "kube-api-access-gnknw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.454129 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "072ec496-edad-47a6-bab8-9bbf5145f86f" (UID: "072ec496-edad-47a6-bab8-9bbf5145f86f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.471245 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.471285 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/072ec496-edad-47a6-bab8-9bbf5145f86f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.471330 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnknw\" (UniqueName: \"kubernetes.io/projected/072ec496-edad-47a6-bab8-9bbf5145f86f-kube-api-access-gnknw\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.638320 4764 generic.go:334] "Generic (PLEG): container finished" podID="072ec496-edad-47a6-bab8-9bbf5145f86f" 
containerID="a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740" exitCode=0 Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.638643 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerDied","Data":"a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740"} Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.638670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-spws6" event={"ID":"072ec496-edad-47a6-bab8-9bbf5145f86f","Type":"ContainerDied","Data":"22f1220be40aef3da9644910968fc02b68fda893959e44c11c2c04c7508519f1"} Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.638687 4764 scope.go:117] "RemoveContainer" containerID="a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.638829 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-spws6" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.661120 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-spws6"] Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.668324 4764 scope.go:117] "RemoveContainer" containerID="c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.669673 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-spws6"] Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.696781 4764 scope.go:117] "RemoveContainer" containerID="95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.738398 4764 scope.go:117] "RemoveContainer" containerID="a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740" Dec 04 01:37:34 crc kubenswrapper[4764]: E1204 01:37:34.738841 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740\": container with ID starting with a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740 not found: ID does not exist" containerID="a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.738886 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740"} err="failed to get container status \"a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740\": rpc error: code = NotFound desc = could not find container \"a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740\": container with ID starting with a418319bbcf55ba7ea0b59a4563a38c5f158b10ff0c0ef315544772cb195f740 not 
found: ID does not exist" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.738914 4764 scope.go:117] "RemoveContainer" containerID="c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b" Dec 04 01:37:34 crc kubenswrapper[4764]: E1204 01:37:34.739455 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b\": container with ID starting with c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b not found: ID does not exist" containerID="c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.739508 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b"} err="failed to get container status \"c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b\": rpc error: code = NotFound desc = could not find container \"c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b\": container with ID starting with c1888d62a6c3fc3c50e92e02002d38c0defcf2150bc9c92632e6116065f3635b not found: ID does not exist" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.739543 4764 scope.go:117] "RemoveContainer" containerID="95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140" Dec 04 01:37:34 crc kubenswrapper[4764]: E1204 01:37:34.739980 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140\": container with ID starting with 95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140 not found: ID does not exist" containerID="95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140" Dec 04 01:37:34 crc kubenswrapper[4764]: I1204 01:37:34.740014 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140"} err="failed to get container status \"95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140\": rpc error: code = NotFound desc = could not find container \"95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140\": container with ID starting with 95c5bd89940f30c92a0a43c76bfd22a83520a80b663694520f54b6faaec02140 not found: ID does not exist" Dec 04 01:37:36 crc kubenswrapper[4764]: I1204 01:37:36.567890 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" path="/var/lib/kubelet/pods/072ec496-edad-47a6-bab8-9bbf5145f86f/volumes" Dec 04 01:37:40 crc kubenswrapper[4764]: I1204 01:37:40.546461 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:37:40 crc kubenswrapper[4764]: E1204 01:37:40.547182 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:37:40 crc kubenswrapper[4764]: I1204 01:37:40.847925 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:40 crc kubenswrapper[4764]: I1204 01:37:40.901913 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:41 crc kubenswrapper[4764]: I1204 01:37:41.086601 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccgjw"] Dec 04 
01:37:42 crc kubenswrapper[4764]: I1204 01:37:42.741184 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ccgjw" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="registry-server" containerID="cri-o://3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653" gracePeriod=2 Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.245639 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.297116 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-catalog-content\") pod \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.297308 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-utilities\") pod \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.297366 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8hdj\" (UniqueName: \"kubernetes.io/projected/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-kube-api-access-g8hdj\") pod \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\" (UID: \"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61\") " Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.298044 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-utilities" (OuterVolumeSpecName: "utilities") pod "ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" (UID: "ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.303205 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-kube-api-access-g8hdj" (OuterVolumeSpecName: "kube-api-access-g8hdj") pod "ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" (UID: "ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61"). InnerVolumeSpecName "kube-api-access-g8hdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.400160 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.400196 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8hdj\" (UniqueName: \"kubernetes.io/projected/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-kube-api-access-g8hdj\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.412638 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" (UID: "ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.503831 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.755343 4764 generic.go:334] "Generic (PLEG): container finished" podID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerID="3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653" exitCode=0 Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.755396 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerDied","Data":"3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653"} Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.755428 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ccgjw" event={"ID":"ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61","Type":"ContainerDied","Data":"4ad303dc7dff36ea55f74725023d3221fb5fdcd2da673fc5ac4e3e64c4c8305f"} Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.755455 4764 scope.go:117] "RemoveContainer" containerID="3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.755465 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ccgjw" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.794744 4764 scope.go:117] "RemoveContainer" containerID="e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.798970 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ccgjw"] Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.809150 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ccgjw"] Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.832975 4764 scope.go:117] "RemoveContainer" containerID="73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.873313 4764 scope.go:117] "RemoveContainer" containerID="3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653" Dec 04 01:37:43 crc kubenswrapper[4764]: E1204 01:37:43.873967 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653\": container with ID starting with 3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653 not found: ID does not exist" containerID="3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.874015 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653"} err="failed to get container status \"3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653\": rpc error: code = NotFound desc = could not find container \"3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653\": container with ID starting with 3042f44f59aa33bad8af1c40cee06c085a198001d3fa8af8fd76128065570653 not found: ID does 
not exist" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.874049 4764 scope.go:117] "RemoveContainer" containerID="e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9" Dec 04 01:37:43 crc kubenswrapper[4764]: E1204 01:37:43.874543 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9\": container with ID starting with e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9 not found: ID does not exist" containerID="e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.874597 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9"} err="failed to get container status \"e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9\": rpc error: code = NotFound desc = could not find container \"e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9\": container with ID starting with e97a7f71f87691420a88c518f92202b3358b66fa60b5727241a58dc726df1ef9 not found: ID does not exist" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.874635 4764 scope.go:117] "RemoveContainer" containerID="73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6" Dec 04 01:37:43 crc kubenswrapper[4764]: E1204 01:37:43.875112 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6\": container with ID starting with 73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6 not found: ID does not exist" containerID="73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6" Dec 04 01:37:43 crc kubenswrapper[4764]: I1204 01:37:43.875159 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6"} err="failed to get container status \"73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6\": rpc error: code = NotFound desc = could not find container \"73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6\": container with ID starting with 73ea68d2665df96b0b3c8afaf8bea6af78ecd91de5f67abc5439f4f3dc5d5ba6 not found: ID does not exist" Dec 04 01:37:44 crc kubenswrapper[4764]: I1204 01:37:44.567984 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" path="/var/lib/kubelet/pods/ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61/volumes" Dec 04 01:37:53 crc kubenswrapper[4764]: I1204 01:37:53.547382 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:37:53 crc kubenswrapper[4764]: E1204 01:37:53.548430 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:37:55 crc kubenswrapper[4764]: I1204 01:37:55.903441 4764 generic.go:334] "Generic (PLEG): container finished" podID="4336186f-bad0-463f-8133-1f6d260ab27f" containerID="0c23565bd50f1ec54c33b19db3f9c24df1bacbf58098fb72015a7acaeee8e7ee" exitCode=0 Dec 04 01:37:55 crc kubenswrapper[4764]: I1204 01:37:55.903536 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" 
event={"ID":"4336186f-bad0-463f-8133-1f6d260ab27f","Type":"ContainerDied","Data":"0c23565bd50f1ec54c33b19db3f9c24df1bacbf58098fb72015a7acaeee8e7ee"} Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.522090 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.641419 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-tripleo-cleanup-combined-ca-bundle\") pod \"4336186f-bad0-463f-8133-1f6d260ab27f\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.641531 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvp5p\" (UniqueName: \"kubernetes.io/projected/4336186f-bad0-463f-8133-1f6d260ab27f-kube-api-access-hvp5p\") pod \"4336186f-bad0-463f-8133-1f6d260ab27f\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.641634 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ceph\") pod \"4336186f-bad0-463f-8133-1f6d260ab27f\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.641669 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-inventory\") pod \"4336186f-bad0-463f-8133-1f6d260ab27f\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.641741 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ssh-key\") pod \"4336186f-bad0-463f-8133-1f6d260ab27f\" (UID: \"4336186f-bad0-463f-8133-1f6d260ab27f\") " Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.647256 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ceph" (OuterVolumeSpecName: "ceph") pod "4336186f-bad0-463f-8133-1f6d260ab27f" (UID: "4336186f-bad0-463f-8133-1f6d260ab27f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.649457 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4336186f-bad0-463f-8133-1f6d260ab27f-kube-api-access-hvp5p" (OuterVolumeSpecName: "kube-api-access-hvp5p") pod "4336186f-bad0-463f-8133-1f6d260ab27f" (UID: "4336186f-bad0-463f-8133-1f6d260ab27f"). InnerVolumeSpecName "kube-api-access-hvp5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.652566 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "4336186f-bad0-463f-8133-1f6d260ab27f" (UID: "4336186f-bad0-463f-8133-1f6d260ab27f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.675027 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-inventory" (OuterVolumeSpecName: "inventory") pod "4336186f-bad0-463f-8133-1f6d260ab27f" (UID: "4336186f-bad0-463f-8133-1f6d260ab27f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.676528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4336186f-bad0-463f-8133-1f6d260ab27f" (UID: "4336186f-bad0-463f-8133-1f6d260ab27f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.744911 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.744941 4764 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.744952 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvp5p\" (UniqueName: \"kubernetes.io/projected/4336186f-bad0-463f-8133-1f6d260ab27f-kube-api-access-hvp5p\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.744963 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.744974 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4336186f-bad0-463f-8133-1f6d260ab27f-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.932640 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" 
event={"ID":"4336186f-bad0-463f-8133-1f6d260ab27f","Type":"ContainerDied","Data":"a5f0756e03584d093c773ab9521123641137d2e50106ca02f2448c72a1aac36b"} Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.932693 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw" Dec 04 01:37:57 crc kubenswrapper[4764]: I1204 01:37:57.932705 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5f0756e03584d093c773ab9521123641137d2e50106ca02f2448c72a1aac36b" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.842061 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-5h7lq"] Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843031 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4336186f-bad0-463f-8133-1f6d260ab27f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843049 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4336186f-bad0-463f-8133-1f6d260ab27f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843073 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="extract-utilities" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843086 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="extract-utilities" Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843110 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="extract-content" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843118 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" 
containerName="extract-content" Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843134 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="registry-server" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843143 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="registry-server" Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843166 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="extract-utilities" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843176 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="extract-utilities" Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843198 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="extract-content" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843205 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="extract-content" Dec 04 01:38:00 crc kubenswrapper[4764]: E1204 01:38:00.843221 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="registry-server" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843229 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="registry-server" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843500 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="072ec496-edad-47a6-bab8-9bbf5145f86f" containerName="registry-server" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843521 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4336186f-bad0-463f-8133-1f6d260ab27f" 
containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.843532 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2de20e-84b2-4d8c-8fcb-35aa8c1afa61" containerName="registry-server" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.844574 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.848630 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.848630 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.848963 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.855997 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-5h7lq"] Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.856234 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.922858 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvnn\" (UniqueName: \"kubernetes.io/projected/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-kube-api-access-hzvnn\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.922962 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.923083 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-inventory\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.923152 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:00 crc kubenswrapper[4764]: I1204 01:38:00.923419 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ceph\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.024781 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc 
kubenswrapper[4764]: I1204 01:38:01.024897 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ceph\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.024935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvnn\" (UniqueName: \"kubernetes.io/projected/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-kube-api-access-hzvnn\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.024967 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.025025 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-inventory\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.031192 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-inventory\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 
04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.031202 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ceph\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.031919 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.032370 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.041672 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvnn\" (UniqueName: \"kubernetes.io/projected/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-kube-api-access-hzvnn\") pod \"bootstrap-openstack-openstack-cell1-5h7lq\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.171403 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.824635 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.827421 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-5h7lq"] Dec 04 01:38:01 crc kubenswrapper[4764]: I1204 01:38:01.994754 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" event={"ID":"29cfe6b3-652e-48f3-974f-4b3cbe3815d4","Type":"ContainerStarted","Data":"1199253b5098e877329194764366667458f2504b9d14ca0091e16eaa589c2198"} Dec 04 01:38:03 crc kubenswrapper[4764]: I1204 01:38:03.006120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" event={"ID":"29cfe6b3-652e-48f3-974f-4b3cbe3815d4","Type":"ContainerStarted","Data":"d712b5836cb9e67da6c065bebbf836feeed8d15a611df1e548a9f581d83b8391"} Dec 04 01:38:07 crc kubenswrapper[4764]: I1204 01:38:07.567891 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:38:07 crc kubenswrapper[4764]: E1204 01:38:07.568781 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:38:21 crc kubenswrapper[4764]: I1204 01:38:21.547997 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:38:22 crc kubenswrapper[4764]: I1204 01:38:22.216965 4764 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"2ba34224840d19d3f9bd51f0cb16c27b287d79dec31722954a999e07ba3defd5"} Dec 04 01:38:22 crc kubenswrapper[4764]: I1204 01:38:22.234423 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" podStartSLOduration=21.718299686 podStartE2EDuration="22.23440912s" podCreationTimestamp="2025-12-04 01:38:00 +0000 UTC" firstStartedPulling="2025-12-04 01:38:01.824446392 +0000 UTC m=+7017.585770793" lastFinishedPulling="2025-12-04 01:38:02.340555786 +0000 UTC m=+7018.101880227" observedRunningTime="2025-12-04 01:38:03.042398401 +0000 UTC m=+7018.803722872" watchObservedRunningTime="2025-12-04 01:38:22.23440912 +0000 UTC m=+7037.995733531" Dec 04 01:40:46 crc kubenswrapper[4764]: I1204 01:40:46.977823 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-29bb6"] Dec 04 01:40:46 crc kubenswrapper[4764]: I1204 01:40:46.982373 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.006232 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29bb6"] Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.105949 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6g2l\" (UniqueName: \"kubernetes.io/projected/f1cb3f9b-22a8-41f5-9942-5619a049754b-kube-api-access-l6g2l\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.106219 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-utilities\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.106268 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-catalog-content\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.208680 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6g2l\" (UniqueName: \"kubernetes.io/projected/f1cb3f9b-22a8-41f5-9942-5619a049754b-kube-api-access-l6g2l\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.208927 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-utilities\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.208969 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-catalog-content\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.209434 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-utilities\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.209496 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-catalog-content\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.229868 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6g2l\" (UniqueName: \"kubernetes.io/projected/f1cb3f9b-22a8-41f5-9942-5619a049754b-kube-api-access-l6g2l\") pod \"community-operators-29bb6\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.307312 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:47 crc kubenswrapper[4764]: I1204 01:40:47.974686 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-29bb6"] Dec 04 01:40:48 crc kubenswrapper[4764]: I1204 01:40:48.925847 4764 generic.go:334] "Generic (PLEG): container finished" podID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerID="c9efa9f5288b31f38f2f4d751f7774ba6feb295d70b057cf54a86f14be36adc3" exitCode=0 Dec 04 01:40:48 crc kubenswrapper[4764]: I1204 01:40:48.925923 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerDied","Data":"c9efa9f5288b31f38f2f4d751f7774ba6feb295d70b057cf54a86f14be36adc3"} Dec 04 01:40:48 crc kubenswrapper[4764]: I1204 01:40:48.926102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerStarted","Data":"b377746b8323a19a7f4d12e1a356e6ee430700fc05338d74e370a79a7dc64e30"} Dec 04 01:40:49 crc kubenswrapper[4764]: I1204 01:40:49.937801 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerStarted","Data":"c4ce06187ffc50809a1fbdb3115b5f2a49f5ca153839b51443cd5e7a2a3cde5b"} Dec 04 01:40:50 crc kubenswrapper[4764]: I1204 01:40:50.869577 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:40:50 crc kubenswrapper[4764]: I1204 01:40:50.869935 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:40:50 crc kubenswrapper[4764]: I1204 01:40:50.954787 4764 generic.go:334] "Generic (PLEG): container finished" podID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerID="c4ce06187ffc50809a1fbdb3115b5f2a49f5ca153839b51443cd5e7a2a3cde5b" exitCode=0 Dec 04 01:40:50 crc kubenswrapper[4764]: I1204 01:40:50.954837 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerDied","Data":"c4ce06187ffc50809a1fbdb3115b5f2a49f5ca153839b51443cd5e7a2a3cde5b"} Dec 04 01:40:51 crc kubenswrapper[4764]: I1204 01:40:51.968303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerStarted","Data":"72e99312ac930679368e7b975d1a173a821a1f30420a9014c72fb86f2d9ae624"} Dec 04 01:40:51 crc kubenswrapper[4764]: I1204 01:40:51.996366 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-29bb6" podStartSLOduration=3.526162957 podStartE2EDuration="5.996347699s" podCreationTimestamp="2025-12-04 01:40:46 +0000 UTC" firstStartedPulling="2025-12-04 01:40:48.92824326 +0000 UTC m=+7184.689567661" lastFinishedPulling="2025-12-04 01:40:51.398427992 +0000 UTC m=+7187.159752403" observedRunningTime="2025-12-04 01:40:51.990980077 +0000 UTC m=+7187.752304488" watchObservedRunningTime="2025-12-04 01:40:51.996347699 +0000 UTC m=+7187.757672110" Dec 04 01:40:57 crc kubenswrapper[4764]: I1204 01:40:57.307726 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:57 crc kubenswrapper[4764]: I1204 01:40:57.308296 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:57 crc kubenswrapper[4764]: I1204 01:40:57.366007 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:58 crc kubenswrapper[4764]: I1204 01:40:58.093020 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:40:58 crc kubenswrapper[4764]: I1204 01:40:58.169654 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29bb6"] Dec 04 01:41:00 crc kubenswrapper[4764]: I1204 01:41:00.053057 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-29bb6" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="registry-server" containerID="cri-o://72e99312ac930679368e7b975d1a173a821a1f30420a9014c72fb86f2d9ae624" gracePeriod=2 Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.073120 4764 generic.go:334] "Generic (PLEG): container finished" podID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerID="72e99312ac930679368e7b975d1a173a821a1f30420a9014c72fb86f2d9ae624" exitCode=0 Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.073199 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerDied","Data":"72e99312ac930679368e7b975d1a173a821a1f30420a9014c72fb86f2d9ae624"} Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.073909 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-29bb6" 
event={"ID":"f1cb3f9b-22a8-41f5-9942-5619a049754b","Type":"ContainerDied","Data":"b377746b8323a19a7f4d12e1a356e6ee430700fc05338d74e370a79a7dc64e30"} Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.073939 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b377746b8323a19a7f4d12e1a356e6ee430700fc05338d74e370a79a7dc64e30" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.096593 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.245702 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-catalog-content\") pod \"f1cb3f9b-22a8-41f5-9942-5619a049754b\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.245913 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-utilities\") pod \"f1cb3f9b-22a8-41f5-9942-5619a049754b\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.245950 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6g2l\" (UniqueName: \"kubernetes.io/projected/f1cb3f9b-22a8-41f5-9942-5619a049754b-kube-api-access-l6g2l\") pod \"f1cb3f9b-22a8-41f5-9942-5619a049754b\" (UID: \"f1cb3f9b-22a8-41f5-9942-5619a049754b\") " Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.247398 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-utilities" (OuterVolumeSpecName: "utilities") pod "f1cb3f9b-22a8-41f5-9942-5619a049754b" (UID: "f1cb3f9b-22a8-41f5-9942-5619a049754b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.253004 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1cb3f9b-22a8-41f5-9942-5619a049754b-kube-api-access-l6g2l" (OuterVolumeSpecName: "kube-api-access-l6g2l") pod "f1cb3f9b-22a8-41f5-9942-5619a049754b" (UID: "f1cb3f9b-22a8-41f5-9942-5619a049754b"). InnerVolumeSpecName "kube-api-access-l6g2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.299214 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1cb3f9b-22a8-41f5-9942-5619a049754b" (UID: "f1cb3f9b-22a8-41f5-9942-5619a049754b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.349000 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.349307 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6g2l\" (UniqueName: \"kubernetes.io/projected/f1cb3f9b-22a8-41f5-9942-5619a049754b-kube-api-access-l6g2l\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:01 crc kubenswrapper[4764]: I1204 01:41:01.349326 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1cb3f9b-22a8-41f5-9942-5619a049754b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:02 crc kubenswrapper[4764]: I1204 01:41:02.087569 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-29bb6" Dec 04 01:41:02 crc kubenswrapper[4764]: I1204 01:41:02.161361 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-29bb6"] Dec 04 01:41:02 crc kubenswrapper[4764]: I1204 01:41:02.195692 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-29bb6"] Dec 04 01:41:02 crc kubenswrapper[4764]: I1204 01:41:02.560257 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" path="/var/lib/kubelet/pods/f1cb3f9b-22a8-41f5-9942-5619a049754b/volumes" Dec 04 01:41:16 crc kubenswrapper[4764]: I1204 01:41:16.255760 4764 generic.go:334] "Generic (PLEG): container finished" podID="29cfe6b3-652e-48f3-974f-4b3cbe3815d4" containerID="d712b5836cb9e67da6c065bebbf836feeed8d15a611df1e548a9f581d83b8391" exitCode=0 Dec 04 01:41:16 crc kubenswrapper[4764]: I1204 01:41:16.256464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" event={"ID":"29cfe6b3-652e-48f3-974f-4b3cbe3815d4","Type":"ContainerDied","Data":"d712b5836cb9e67da6c065bebbf836feeed8d15a611df1e548a9f581d83b8391"} Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.746195 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.868530 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ssh-key\") pod \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.868585 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-bootstrap-combined-ca-bundle\") pod \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.868660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-inventory\") pod \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.868731 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ceph\") pod \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.868798 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzvnn\" (UniqueName: \"kubernetes.io/projected/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-kube-api-access-hzvnn\") pod \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\" (UID: \"29cfe6b3-652e-48f3-974f-4b3cbe3815d4\") " Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.874608 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "29cfe6b3-652e-48f3-974f-4b3cbe3815d4" (UID: "29cfe6b3-652e-48f3-974f-4b3cbe3815d4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.875851 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-kube-api-access-hzvnn" (OuterVolumeSpecName: "kube-api-access-hzvnn") pod "29cfe6b3-652e-48f3-974f-4b3cbe3815d4" (UID: "29cfe6b3-652e-48f3-974f-4b3cbe3815d4"). InnerVolumeSpecName "kube-api-access-hzvnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.876041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ceph" (OuterVolumeSpecName: "ceph") pod "29cfe6b3-652e-48f3-974f-4b3cbe3815d4" (UID: "29cfe6b3-652e-48f3-974f-4b3cbe3815d4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.912742 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29cfe6b3-652e-48f3-974f-4b3cbe3815d4" (UID: "29cfe6b3-652e-48f3-974f-4b3cbe3815d4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.914414 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-inventory" (OuterVolumeSpecName: "inventory") pod "29cfe6b3-652e-48f3-974f-4b3cbe3815d4" (UID: "29cfe6b3-652e-48f3-974f-4b3cbe3815d4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.971317 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.971648 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.971782 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.971864 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:17 crc kubenswrapper[4764]: I1204 01:41:17.972006 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzvnn\" (UniqueName: \"kubernetes.io/projected/29cfe6b3-652e-48f3-974f-4b3cbe3815d4-kube-api-access-hzvnn\") on node \"crc\" DevicePath \"\"" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.322613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" event={"ID":"29cfe6b3-652e-48f3-974f-4b3cbe3815d4","Type":"ContainerDied","Data":"1199253b5098e877329194764366667458f2504b9d14ca0091e16eaa589c2198"} Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.322655 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1199253b5098e877329194764366667458f2504b9d14ca0091e16eaa589c2198" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.322800 4764 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-5h7lq" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.430862 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mrftr"] Dec 04 01:41:18 crc kubenswrapper[4764]: E1204 01:41:18.431450 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="extract-content" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.431473 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="extract-content" Dec 04 01:41:18 crc kubenswrapper[4764]: E1204 01:41:18.431482 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="registry-server" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.431491 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="registry-server" Dec 04 01:41:18 crc kubenswrapper[4764]: E1204 01:41:18.431515 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29cfe6b3-652e-48f3-974f-4b3cbe3815d4" containerName="bootstrap-openstack-openstack-cell1" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.431526 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="29cfe6b3-652e-48f3-974f-4b3cbe3815d4" containerName="bootstrap-openstack-openstack-cell1" Dec 04 01:41:18 crc kubenswrapper[4764]: E1204 01:41:18.431546 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="extract-utilities" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.431555 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="extract-utilities" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 
01:41:18.431852 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="29cfe6b3-652e-48f3-974f-4b3cbe3815d4" containerName="bootstrap-openstack-openstack-cell1" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.431888 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1cb3f9b-22a8-41f5-9942-5619a049754b" containerName="registry-server" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.432877 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.435617 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.435660 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.437097 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.437202 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.447422 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mrftr"] Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.584045 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.584092 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ceph\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.584119 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-inventory\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.584348 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snjpw\" (UniqueName: \"kubernetes.io/projected/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-kube-api-access-snjpw\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.688741 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.688824 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ceph\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " 
pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.688855 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-inventory\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.689058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snjpw\" (UniqueName: \"kubernetes.io/projected/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-kube-api-access-snjpw\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.694189 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-inventory\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.695121 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.695225 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ceph\") pod \"download-cache-openstack-openstack-cell1-mrftr\" 
(UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.707936 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snjpw\" (UniqueName: \"kubernetes.io/projected/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-kube-api-access-snjpw\") pod \"download-cache-openstack-openstack-cell1-mrftr\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:18 crc kubenswrapper[4764]: I1204 01:41:18.750006 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:41:19 crc kubenswrapper[4764]: I1204 01:41:19.285982 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-mrftr"] Dec 04 01:41:19 crc kubenswrapper[4764]: I1204 01:41:19.333094 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" event={"ID":"1b9e5f50-30a4-4d90-bc88-4971bcc8740a","Type":"ContainerStarted","Data":"1a3e10488935f2e6f07d053cdd149925c069dd371edae4fd8d7133f4ac34c775"} Dec 04 01:41:20 crc kubenswrapper[4764]: I1204 01:41:20.359049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" event={"ID":"1b9e5f50-30a4-4d90-bc88-4971bcc8740a","Type":"ContainerStarted","Data":"608a9d9c7574d94a83ef0fcb4d39c4d99c9a4856caf6486cf871cc639650f567"} Dec 04 01:41:20 crc kubenswrapper[4764]: I1204 01:41:20.391824 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" podStartSLOduration=1.88544883 podStartE2EDuration="2.391792373s" podCreationTimestamp="2025-12-04 01:41:18 +0000 UTC" firstStartedPulling="2025-12-04 01:41:19.289084692 +0000 UTC m=+7215.050409103" 
lastFinishedPulling="2025-12-04 01:41:19.795428225 +0000 UTC m=+7215.556752646" observedRunningTime="2025-12-04 01:41:20.386889673 +0000 UTC m=+7216.148214114" watchObservedRunningTime="2025-12-04 01:41:20.391792373 +0000 UTC m=+7216.153116844" Dec 04 01:41:20 crc kubenswrapper[4764]: I1204 01:41:20.869182 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:41:20 crc kubenswrapper[4764]: I1204 01:41:20.869472 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:41:50 crc kubenswrapper[4764]: I1204 01:41:50.868868 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:41:50 crc kubenswrapper[4764]: I1204 01:41:50.869522 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:41:50 crc kubenswrapper[4764]: I1204 01:41:50.869573 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:41:50 crc kubenswrapper[4764]: I1204 
01:41:50.870571 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2ba34224840d19d3f9bd51f0cb16c27b287d79dec31722954a999e07ba3defd5"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:41:50 crc kubenswrapper[4764]: I1204 01:41:50.870642 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://2ba34224840d19d3f9bd51f0cb16c27b287d79dec31722954a999e07ba3defd5" gracePeriod=600 Dec 04 01:41:51 crc kubenswrapper[4764]: I1204 01:41:51.751008 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="2ba34224840d19d3f9bd51f0cb16c27b287d79dec31722954a999e07ba3defd5" exitCode=0 Dec 04 01:41:51 crc kubenswrapper[4764]: I1204 01:41:51.751102 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"2ba34224840d19d3f9bd51f0cb16c27b287d79dec31722954a999e07ba3defd5"} Dec 04 01:41:51 crc kubenswrapper[4764]: I1204 01:41:51.751471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c"} Dec 04 01:41:51 crc kubenswrapper[4764]: I1204 01:41:51.751489 4764 scope.go:117] "RemoveContainer" containerID="7df0e71adfc6c0f7fa7d5050d37ce5d5cfb3c9e5c8822528571cae79fbc1bc70" Dec 04 01:42:52 crc kubenswrapper[4764]: I1204 01:42:52.525231 4764 generic.go:334] "Generic (PLEG): container 
finished" podID="1b9e5f50-30a4-4d90-bc88-4971bcc8740a" containerID="608a9d9c7574d94a83ef0fcb4d39c4d99c9a4856caf6486cf871cc639650f567" exitCode=0 Dec 04 01:42:52 crc kubenswrapper[4764]: I1204 01:42:52.525321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" event={"ID":"1b9e5f50-30a4-4d90-bc88-4971bcc8740a","Type":"ContainerDied","Data":"608a9d9c7574d94a83ef0fcb4d39c4d99c9a4856caf6486cf871cc639650f567"} Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.066240 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.222602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snjpw\" (UniqueName: \"kubernetes.io/projected/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-kube-api-access-snjpw\") pod \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.222847 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-inventory\") pod \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.222936 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ceph\") pod \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.222981 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ssh-key\") pod 
\"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\" (UID: \"1b9e5f50-30a4-4d90-bc88-4971bcc8740a\") " Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.239204 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-kube-api-access-snjpw" (OuterVolumeSpecName: "kube-api-access-snjpw") pod "1b9e5f50-30a4-4d90-bc88-4971bcc8740a" (UID: "1b9e5f50-30a4-4d90-bc88-4971bcc8740a"). InnerVolumeSpecName "kube-api-access-snjpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.243857 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ceph" (OuterVolumeSpecName: "ceph") pod "1b9e5f50-30a4-4d90-bc88-4971bcc8740a" (UID: "1b9e5f50-30a4-4d90-bc88-4971bcc8740a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.275319 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b9e5f50-30a4-4d90-bc88-4971bcc8740a" (UID: "1b9e5f50-30a4-4d90-bc88-4971bcc8740a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.296984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-inventory" (OuterVolumeSpecName: "inventory") pod "1b9e5f50-30a4-4d90-bc88-4971bcc8740a" (UID: "1b9e5f50-30a4-4d90-bc88-4971bcc8740a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.326606 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snjpw\" (UniqueName: \"kubernetes.io/projected/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-kube-api-access-snjpw\") on node \"crc\" DevicePath \"\"" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.326637 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.326659 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.326668 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9e5f50-30a4-4d90-bc88-4971bcc8740a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.584272 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.585387 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-mrftr" event={"ID":"1b9e5f50-30a4-4d90-bc88-4971bcc8740a","Type":"ContainerDied","Data":"1a3e10488935f2e6f07d053cdd149925c069dd371edae4fd8d7133f4ac34c775"} Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.585421 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a3e10488935f2e6f07d053cdd149925c069dd371edae4fd8d7133f4ac34c775" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.649810 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lctlv"] Dec 04 01:42:54 crc kubenswrapper[4764]: E1204 01:42:54.650377 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9e5f50-30a4-4d90-bc88-4971bcc8740a" containerName="download-cache-openstack-openstack-cell1" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.650404 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9e5f50-30a4-4d90-bc88-4971bcc8740a" containerName="download-cache-openstack-openstack-cell1" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.650677 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9e5f50-30a4-4d90-bc88-4971bcc8740a" containerName="download-cache-openstack-openstack-cell1" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.651689 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.656330 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.656369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.656429 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.656521 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.664536 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lctlv"] Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.735341 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-inventory\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.735379 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ssh-key\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.735416 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ceph\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.735576 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcwb\" (UniqueName: \"kubernetes.io/projected/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-kube-api-access-kpcwb\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.837650 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-inventory\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.837760 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ssh-key\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.837806 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ceph\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc 
kubenswrapper[4764]: I1204 01:42:54.838087 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcwb\" (UniqueName: \"kubernetes.io/projected/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-kube-api-access-kpcwb\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.842512 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-inventory\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.842667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ssh-key\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.843076 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ceph\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.856461 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcwb\" (UniqueName: \"kubernetes.io/projected/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-kube-api-access-kpcwb\") pod \"configure-network-openstack-openstack-cell1-lctlv\" (UID: 
\"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:54 crc kubenswrapper[4764]: I1204 01:42:54.979664 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:42:55 crc kubenswrapper[4764]: I1204 01:42:55.579587 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-lctlv"] Dec 04 01:42:56 crc kubenswrapper[4764]: I1204 01:42:56.573546 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" event={"ID":"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb","Type":"ContainerStarted","Data":"5dd72b4497541f492eb701e989e827c19d007fa93bb41cfbebbdc5c5ee0bf71f"} Dec 04 01:42:56 crc kubenswrapper[4764]: I1204 01:42:56.574028 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" event={"ID":"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb","Type":"ContainerStarted","Data":"08ab418bdfcea92d96aa6b9207aac33b61663d4a1e9565890d65026500dceb90"} Dec 04 01:42:56 crc kubenswrapper[4764]: I1204 01:42:56.605898 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" podStartSLOduration=2.081998896 podStartE2EDuration="2.605880231s" podCreationTimestamp="2025-12-04 01:42:54 +0000 UTC" firstStartedPulling="2025-12-04 01:42:55.58977462 +0000 UTC m=+7311.351099031" lastFinishedPulling="2025-12-04 01:42:56.113655945 +0000 UTC m=+7311.874980366" observedRunningTime="2025-12-04 01:42:56.602069897 +0000 UTC m=+7312.363394338" watchObservedRunningTime="2025-12-04 01:42:56.605880231 +0000 UTC m=+7312.367204642" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.151855 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-99cxw"] Dec 04 
01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.154820 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.175358 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99cxw"] Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.229093 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8rrn\" (UniqueName: \"kubernetes.io/projected/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-kube-api-access-h8rrn\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.229334 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-catalog-content\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.229616 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-utilities\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.331746 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-catalog-content\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 
04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.331935 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-utilities\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.332058 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8rrn\" (UniqueName: \"kubernetes.io/projected/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-kube-api-access-h8rrn\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.332291 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-catalog-content\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.332341 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-utilities\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.351213 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8rrn\" (UniqueName: \"kubernetes.io/projected/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-kube-api-access-h8rrn\") pod \"redhat-marketplace-99cxw\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 
01:43:21.484602 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:21 crc kubenswrapper[4764]: I1204 01:43:21.971379 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-99cxw"] Dec 04 01:43:22 crc kubenswrapper[4764]: I1204 01:43:22.922306 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerID="de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333" exitCode=0 Dec 04 01:43:22 crc kubenswrapper[4764]: I1204 01:43:22.922376 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99cxw" event={"ID":"e6095137-5bcc-4fb5-8d62-083c88c5ae9e","Type":"ContainerDied","Data":"de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333"} Dec 04 01:43:22 crc kubenswrapper[4764]: I1204 01:43:22.922625 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99cxw" event={"ID":"e6095137-5bcc-4fb5-8d62-083c88c5ae9e","Type":"ContainerStarted","Data":"0bc68e78634309e3047e8a1e315e30a0887d5b0d394eaea4abb350fa54b2b7d0"} Dec 04 01:43:22 crc kubenswrapper[4764]: I1204 01:43:22.925818 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:43:24 crc kubenswrapper[4764]: I1204 01:43:24.944105 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerID="ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6" exitCode=0 Dec 04 01:43:24 crc kubenswrapper[4764]: I1204 01:43:24.944591 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99cxw" event={"ID":"e6095137-5bcc-4fb5-8d62-083c88c5ae9e","Type":"ContainerDied","Data":"ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6"} Dec 04 01:43:25 crc kubenswrapper[4764]: 
I1204 01:43:25.957091 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99cxw" event={"ID":"e6095137-5bcc-4fb5-8d62-083c88c5ae9e","Type":"ContainerStarted","Data":"f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a"} Dec 04 01:43:25 crc kubenswrapper[4764]: I1204 01:43:25.982488 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-99cxw" podStartSLOduration=2.486069067 podStartE2EDuration="4.982470635s" podCreationTimestamp="2025-12-04 01:43:21 +0000 UTC" firstStartedPulling="2025-12-04 01:43:22.925362346 +0000 UTC m=+7338.686686797" lastFinishedPulling="2025-12-04 01:43:25.421763954 +0000 UTC m=+7341.183088365" observedRunningTime="2025-12-04 01:43:25.975637246 +0000 UTC m=+7341.736961667" watchObservedRunningTime="2025-12-04 01:43:25.982470635 +0000 UTC m=+7341.743795046" Dec 04 01:43:31 crc kubenswrapper[4764]: I1204 01:43:31.485526 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:31 crc kubenswrapper[4764]: I1204 01:43:31.486158 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:31 crc kubenswrapper[4764]: I1204 01:43:31.563283 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:32 crc kubenswrapper[4764]: I1204 01:43:32.107627 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:32 crc kubenswrapper[4764]: I1204 01:43:32.184701 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99cxw"] Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.054072 4764 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-99cxw" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="registry-server" containerID="cri-o://f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a" gracePeriod=2 Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.605920 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.666772 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8rrn\" (UniqueName: \"kubernetes.io/projected/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-kube-api-access-h8rrn\") pod \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.666869 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-catalog-content\") pod \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.666917 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-utilities\") pod \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\" (UID: \"e6095137-5bcc-4fb5-8d62-083c88c5ae9e\") " Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.668164 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-utilities" (OuterVolumeSpecName: "utilities") pod "e6095137-5bcc-4fb5-8d62-083c88c5ae9e" (UID: "e6095137-5bcc-4fb5-8d62-083c88c5ae9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.669113 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.673983 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-kube-api-access-h8rrn" (OuterVolumeSpecName: "kube-api-access-h8rrn") pod "e6095137-5bcc-4fb5-8d62-083c88c5ae9e" (UID: "e6095137-5bcc-4fb5-8d62-083c88c5ae9e"). InnerVolumeSpecName "kube-api-access-h8rrn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.697998 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6095137-5bcc-4fb5-8d62-083c88c5ae9e" (UID: "e6095137-5bcc-4fb5-8d62-083c88c5ae9e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.771007 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8rrn\" (UniqueName: \"kubernetes.io/projected/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-kube-api-access-h8rrn\") on node \"crc\" DevicePath \"\"" Dec 04 01:43:34 crc kubenswrapper[4764]: I1204 01:43:34.771040 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6095137-5bcc-4fb5-8d62-083c88c5ae9e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.070555 4764 generic.go:334] "Generic (PLEG): container finished" podID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerID="f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a" exitCode=0 Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.070615 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-99cxw" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.070639 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99cxw" event={"ID":"e6095137-5bcc-4fb5-8d62-083c88c5ae9e","Type":"ContainerDied","Data":"f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a"} Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.071188 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-99cxw" event={"ID":"e6095137-5bcc-4fb5-8d62-083c88c5ae9e","Type":"ContainerDied","Data":"0bc68e78634309e3047e8a1e315e30a0887d5b0d394eaea4abb350fa54b2b7d0"} Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.071213 4764 scope.go:117] "RemoveContainer" containerID="f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.101795 4764 scope.go:117] "RemoveContainer" 
containerID="ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.129191 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-99cxw"] Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.140342 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-99cxw"] Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.151539 4764 scope.go:117] "RemoveContainer" containerID="de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.199020 4764 scope.go:117] "RemoveContainer" containerID="f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a" Dec 04 01:43:35 crc kubenswrapper[4764]: E1204 01:43:35.199915 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a\": container with ID starting with f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a not found: ID does not exist" containerID="f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.201242 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a"} err="failed to get container status \"f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a\": rpc error: code = NotFound desc = could not find container \"f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a\": container with ID starting with f527dfe89242061f6fa81e3d082ed2cc03b07abcc5941d308aeb458295451e8a not found: ID does not exist" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.201521 4764 scope.go:117] "RemoveContainer" 
containerID="ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6" Dec 04 01:43:35 crc kubenswrapper[4764]: E1204 01:43:35.207300 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6\": container with ID starting with ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6 not found: ID does not exist" containerID="ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.207377 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6"} err="failed to get container status \"ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6\": rpc error: code = NotFound desc = could not find container \"ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6\": container with ID starting with ca170592514a1cc76adc0fc1d6a4d037a72e17a89996675b6e3be180f88d4fc6 not found: ID does not exist" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.207410 4764 scope.go:117] "RemoveContainer" containerID="de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333" Dec 04 01:43:35 crc kubenswrapper[4764]: E1204 01:43:35.211145 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333\": container with ID starting with de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333 not found: ID does not exist" containerID="de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333" Dec 04 01:43:35 crc kubenswrapper[4764]: I1204 01:43:35.211204 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333"} err="failed to get container status \"de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333\": rpc error: code = NotFound desc = could not find container \"de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333\": container with ID starting with de45fa90bc70d374312ad54a0d05513d5844c29f812cbcf7f8214f6572a47333 not found: ID does not exist" Dec 04 01:43:36 crc kubenswrapper[4764]: I1204 01:43:36.563559 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" path="/var/lib/kubelet/pods/e6095137-5bcc-4fb5-8d62-083c88c5ae9e/volumes" Dec 04 01:44:20 crc kubenswrapper[4764]: I1204 01:44:20.869081 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:44:20 crc kubenswrapper[4764]: I1204 01:44:20.869676 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:44:23 crc kubenswrapper[4764]: I1204 01:44:23.684907 4764 generic.go:334] "Generic (PLEG): container finished" podID="a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" containerID="5dd72b4497541f492eb701e989e827c19d007fa93bb41cfbebbdc5c5ee0bf71f" exitCode=0 Dec 04 01:44:23 crc kubenswrapper[4764]: I1204 01:44:23.685004 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" 
event={"ID":"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb","Type":"ContainerDied","Data":"5dd72b4497541f492eb701e989e827c19d007fa93bb41cfbebbdc5c5ee0bf71f"} Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.206832 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.282660 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcwb\" (UniqueName: \"kubernetes.io/projected/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-kube-api-access-kpcwb\") pod \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.283025 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ceph\") pod \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.283133 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-inventory\") pod \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.283281 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ssh-key\") pod \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\" (UID: \"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb\") " Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.287914 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ceph" (OuterVolumeSpecName: "ceph") pod 
"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" (UID: "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.295874 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-kube-api-access-kpcwb" (OuterVolumeSpecName: "kube-api-access-kpcwb") pod "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" (UID: "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb"). InnerVolumeSpecName "kube-api-access-kpcwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.311774 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" (UID: "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.320462 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-inventory" (OuterVolumeSpecName: "inventory") pod "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" (UID: "a9087c3e-7d6e-4590-a64c-3f9b5d3826fb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.385943 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpcwb\" (UniqueName: \"kubernetes.io/projected/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-kube-api-access-kpcwb\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.386148 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.386271 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.386418 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a9087c3e-7d6e-4590-a64c-3f9b5d3826fb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.713033 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" event={"ID":"a9087c3e-7d6e-4590-a64c-3f9b5d3826fb","Type":"ContainerDied","Data":"08ab418bdfcea92d96aa6b9207aac33b61663d4a1e9565890d65026500dceb90"} Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.713095 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ab418bdfcea92d96aa6b9207aac33b61663d4a1e9565890d65026500dceb90" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.713150 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-lctlv" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.868220 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-m4rkn"] Dec 04 01:44:25 crc kubenswrapper[4764]: E1204 01:44:25.868849 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="extract-content" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.868864 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="extract-content" Dec 04 01:44:25 crc kubenswrapper[4764]: E1204 01:44:25.868894 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" containerName="configure-network-openstack-openstack-cell1" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.868901 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" containerName="configure-network-openstack-openstack-cell1" Dec 04 01:44:25 crc kubenswrapper[4764]: E1204 01:44:25.868921 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="registry-server" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.868927 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="registry-server" Dec 04 01:44:25 crc kubenswrapper[4764]: E1204 01:44:25.868949 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="extract-utilities" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.868956 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="extract-utilities" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.869141 4764 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e6095137-5bcc-4fb5-8d62-083c88c5ae9e" containerName="registry-server" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.869154 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9087c3e-7d6e-4590-a64c-3f9b5d3826fb" containerName="configure-network-openstack-openstack-cell1" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.869929 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.881072 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.881123 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.881499 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-m4rkn"] Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.889243 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:44:25 crc kubenswrapper[4764]: I1204 01:44:25.889612 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.004644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ceph\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.004728 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm5xf\" (UniqueName: \"kubernetes.io/projected/2bbe80b3-f8b0-4197-9836-34847231fe93-kube-api-access-cm5xf\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.004910 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ssh-key\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.005005 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-inventory\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.106914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ssh-key\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.107006 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-inventory\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " 
pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.108041 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ceph\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.108121 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm5xf\" (UniqueName: \"kubernetes.io/projected/2bbe80b3-f8b0-4197-9836-34847231fe93-kube-api-access-cm5xf\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.113813 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-inventory\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.115535 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ssh-key\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.127296 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ceph\") pod 
\"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.127862 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm5xf\" (UniqueName: \"kubernetes.io/projected/2bbe80b3-f8b0-4197-9836-34847231fe93-kube-api-access-cm5xf\") pod \"validate-network-openstack-openstack-cell1-m4rkn\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.250130 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:26 crc kubenswrapper[4764]: I1204 01:44:26.866733 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-m4rkn"] Dec 04 01:44:27 crc kubenswrapper[4764]: I1204 01:44:27.732060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" event={"ID":"2bbe80b3-f8b0-4197-9836-34847231fe93","Type":"ContainerStarted","Data":"a81a9bd3bdddb5d3ba481946e133fc63b506d9362756c9b25ac13485ea4e6950"} Dec 04 01:44:27 crc kubenswrapper[4764]: I1204 01:44:27.732683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" event={"ID":"2bbe80b3-f8b0-4197-9836-34847231fe93","Type":"ContainerStarted","Data":"e3459906c1186307bc11728d5b568ba523e4dabd135c134db3315b6769255dda"} Dec 04 01:44:27 crc kubenswrapper[4764]: I1204 01:44:27.758493 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" podStartSLOduration=2.240816508 podStartE2EDuration="2.75846776s" podCreationTimestamp="2025-12-04 01:44:25 +0000 UTC" 
firstStartedPulling="2025-12-04 01:44:26.874236205 +0000 UTC m=+7402.635560616" lastFinishedPulling="2025-12-04 01:44:27.391887457 +0000 UTC m=+7403.153211868" observedRunningTime="2025-12-04 01:44:27.755160209 +0000 UTC m=+7403.516484630" watchObservedRunningTime="2025-12-04 01:44:27.75846776 +0000 UTC m=+7403.519792201" Dec 04 01:44:32 crc kubenswrapper[4764]: I1204 01:44:32.786257 4764 generic.go:334] "Generic (PLEG): container finished" podID="2bbe80b3-f8b0-4197-9836-34847231fe93" containerID="a81a9bd3bdddb5d3ba481946e133fc63b506d9362756c9b25ac13485ea4e6950" exitCode=0 Dec 04 01:44:32 crc kubenswrapper[4764]: I1204 01:44:32.786357 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" event={"ID":"2bbe80b3-f8b0-4197-9836-34847231fe93","Type":"ContainerDied","Data":"a81a9bd3bdddb5d3ba481946e133fc63b506d9362756c9b25ac13485ea4e6950"} Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.260164 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.410574 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-inventory\") pod \"2bbe80b3-f8b0-4197-9836-34847231fe93\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.410631 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ceph\") pod \"2bbe80b3-f8b0-4197-9836-34847231fe93\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.411077 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ssh-key\") pod \"2bbe80b3-f8b0-4197-9836-34847231fe93\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.411155 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm5xf\" (UniqueName: \"kubernetes.io/projected/2bbe80b3-f8b0-4197-9836-34847231fe93-kube-api-access-cm5xf\") pod \"2bbe80b3-f8b0-4197-9836-34847231fe93\" (UID: \"2bbe80b3-f8b0-4197-9836-34847231fe93\") " Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.419348 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bbe80b3-f8b0-4197-9836-34847231fe93-kube-api-access-cm5xf" (OuterVolumeSpecName: "kube-api-access-cm5xf") pod "2bbe80b3-f8b0-4197-9836-34847231fe93" (UID: "2bbe80b3-f8b0-4197-9836-34847231fe93"). InnerVolumeSpecName "kube-api-access-cm5xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.419547 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ceph" (OuterVolumeSpecName: "ceph") pod "2bbe80b3-f8b0-4197-9836-34847231fe93" (UID: "2bbe80b3-f8b0-4197-9836-34847231fe93"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.440100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-inventory" (OuterVolumeSpecName: "inventory") pod "2bbe80b3-f8b0-4197-9836-34847231fe93" (UID: "2bbe80b3-f8b0-4197-9836-34847231fe93"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.449528 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2bbe80b3-f8b0-4197-9836-34847231fe93" (UID: "2bbe80b3-f8b0-4197-9836-34847231fe93"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.514386 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.514433 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm5xf\" (UniqueName: \"kubernetes.io/projected/2bbe80b3-f8b0-4197-9836-34847231fe93-kube-api-access-cm5xf\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.514450 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.514465 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2bbe80b3-f8b0-4197-9836-34847231fe93-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.814623 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" event={"ID":"2bbe80b3-f8b0-4197-9836-34847231fe93","Type":"ContainerDied","Data":"e3459906c1186307bc11728d5b568ba523e4dabd135c134db3315b6769255dda"} Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.814686 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3459906c1186307bc11728d5b568ba523e4dabd135c134db3315b6769255dda" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.814782 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-m4rkn" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.882751 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-nfbh6"] Dec 04 01:44:34 crc kubenswrapper[4764]: E1204 01:44:34.883576 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bbe80b3-f8b0-4197-9836-34847231fe93" containerName="validate-network-openstack-openstack-cell1" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.883594 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bbe80b3-f8b0-4197-9836-34847231fe93" containerName="validate-network-openstack-openstack-cell1" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.883860 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bbe80b3-f8b0-4197-9836-34847231fe93" containerName="validate-network-openstack-openstack-cell1" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.884838 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.887594 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.887767 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.887974 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.888220 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:44:34 crc kubenswrapper[4764]: I1204 01:44:34.896763 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-nfbh6"] Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.031488 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ssh-key\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.031638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-inventory\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.031906 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ceph\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.032294 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9jmx\" (UniqueName: \"kubernetes.io/projected/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-kube-api-access-s9jmx\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.134411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ceph\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.134518 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9jmx\" (UniqueName: \"kubernetes.io/projected/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-kube-api-access-s9jmx\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.134592 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ssh-key\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.134630 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-inventory\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.139785 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ssh-key\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.140507 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ceph\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.146894 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-inventory\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.173377 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9jmx\" (UniqueName: \"kubernetes.io/projected/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-kube-api-access-s9jmx\") pod \"install-os-openstack-openstack-cell1-nfbh6\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.221174 4764 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:44:35 crc kubenswrapper[4764]: I1204 01:44:35.844500 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-nfbh6"] Dec 04 01:44:36 crc kubenswrapper[4764]: I1204 01:44:36.832406 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" event={"ID":"9b5c6bdf-2afc-44dd-bd15-055fa374edc4","Type":"ContainerStarted","Data":"b8f8e6a3e385ee785dbc8db6bf1a300c77172530efb9de5c4f63245de3503e84"} Dec 04 01:44:36 crc kubenswrapper[4764]: I1204 01:44:36.832966 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" event={"ID":"9b5c6bdf-2afc-44dd-bd15-055fa374edc4","Type":"ContainerStarted","Data":"68f87ac546e889ebaa7384d81a5a32647f99954805a7b728f033e566108e8043"} Dec 04 01:44:36 crc kubenswrapper[4764]: I1204 01:44:36.860074 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" podStartSLOduration=2.450555881 podStartE2EDuration="2.8600539s" podCreationTimestamp="2025-12-04 01:44:34 +0000 UTC" firstStartedPulling="2025-12-04 01:44:35.851325271 +0000 UTC m=+7411.612649722" lastFinishedPulling="2025-12-04 01:44:36.26082329 +0000 UTC m=+7412.022147741" observedRunningTime="2025-12-04 01:44:36.84705366 +0000 UTC m=+7412.608378081" watchObservedRunningTime="2025-12-04 01:44:36.8600539 +0000 UTC m=+7412.621378311" Dec 04 01:44:50 crc kubenswrapper[4764]: I1204 01:44:50.869682 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:44:50 crc kubenswrapper[4764]: I1204 01:44:50.870341 4764 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.157740 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4"] Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.159780 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.161908 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.167753 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.170724 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4"] Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.282465 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7df1ac17-5d6f-4160-aec8-2fed933e366c-secret-volume\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.282540 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7df1ac17-5d6f-4160-aec8-2fed933e366c-config-volume\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.282575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xz6t\" (UniqueName: \"kubernetes.io/projected/7df1ac17-5d6f-4160-aec8-2fed933e366c-kube-api-access-8xz6t\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.384277 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7df1ac17-5d6f-4160-aec8-2fed933e366c-secret-volume\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.384332 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7df1ac17-5d6f-4160-aec8-2fed933e366c-config-volume\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.384361 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xz6t\" (UniqueName: \"kubernetes.io/projected/7df1ac17-5d6f-4160-aec8-2fed933e366c-kube-api-access-8xz6t\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc 
kubenswrapper[4764]: I1204 01:45:00.385550 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7df1ac17-5d6f-4160-aec8-2fed933e366c-config-volume\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.395654 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7df1ac17-5d6f-4160-aec8-2fed933e366c-secret-volume\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.412350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xz6t\" (UniqueName: \"kubernetes.io/projected/7df1ac17-5d6f-4160-aec8-2fed933e366c-kube-api-access-8xz6t\") pod \"collect-profiles-29413545-qcdk4\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.486518 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:00 crc kubenswrapper[4764]: I1204 01:45:00.970189 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4"] Dec 04 01:45:01 crc kubenswrapper[4764]: I1204 01:45:01.428621 4764 generic.go:334] "Generic (PLEG): container finished" podID="7df1ac17-5d6f-4160-aec8-2fed933e366c" containerID="6b045b940c317695064f73cb729fee28dc62c01d9144db2a985ab66c7050c936" exitCode=0 Dec 04 01:45:01 crc kubenswrapper[4764]: I1204 01:45:01.428674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" event={"ID":"7df1ac17-5d6f-4160-aec8-2fed933e366c","Type":"ContainerDied","Data":"6b045b940c317695064f73cb729fee28dc62c01d9144db2a985ab66c7050c936"} Dec 04 01:45:01 crc kubenswrapper[4764]: I1204 01:45:01.428706 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" event={"ID":"7df1ac17-5d6f-4160-aec8-2fed933e366c","Type":"ContainerStarted","Data":"89ddb1acd44ab79f2678e8af9f8f5dbd75a040f068c237ed3561af67fb937e52"} Dec 04 01:45:02 crc kubenswrapper[4764]: I1204 01:45:02.890085 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.075895 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7df1ac17-5d6f-4160-aec8-2fed933e366c-config-volume\") pod \"7df1ac17-5d6f-4160-aec8-2fed933e366c\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.076057 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7df1ac17-5d6f-4160-aec8-2fed933e366c-secret-volume\") pod \"7df1ac17-5d6f-4160-aec8-2fed933e366c\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.076098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xz6t\" (UniqueName: \"kubernetes.io/projected/7df1ac17-5d6f-4160-aec8-2fed933e366c-kube-api-access-8xz6t\") pod \"7df1ac17-5d6f-4160-aec8-2fed933e366c\" (UID: \"7df1ac17-5d6f-4160-aec8-2fed933e366c\") " Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.076660 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7df1ac17-5d6f-4160-aec8-2fed933e366c-config-volume" (OuterVolumeSpecName: "config-volume") pod "7df1ac17-5d6f-4160-aec8-2fed933e366c" (UID: "7df1ac17-5d6f-4160-aec8-2fed933e366c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.082643 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df1ac17-5d6f-4160-aec8-2fed933e366c-kube-api-access-8xz6t" (OuterVolumeSpecName: "kube-api-access-8xz6t") pod "7df1ac17-5d6f-4160-aec8-2fed933e366c" (UID: "7df1ac17-5d6f-4160-aec8-2fed933e366c"). 
InnerVolumeSpecName "kube-api-access-8xz6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.085230 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df1ac17-5d6f-4160-aec8-2fed933e366c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7df1ac17-5d6f-4160-aec8-2fed933e366c" (UID: "7df1ac17-5d6f-4160-aec8-2fed933e366c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.178654 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7df1ac17-5d6f-4160-aec8-2fed933e366c-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.178705 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xz6t\" (UniqueName: \"kubernetes.io/projected/7df1ac17-5d6f-4160-aec8-2fed933e366c-kube-api-access-8xz6t\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.178734 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7df1ac17-5d6f-4160-aec8-2fed933e366c-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.463177 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" event={"ID":"7df1ac17-5d6f-4160-aec8-2fed933e366c","Type":"ContainerDied","Data":"89ddb1acd44ab79f2678e8af9f8f5dbd75a040f068c237ed3561af67fb937e52"} Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.463663 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89ddb1acd44ab79f2678e8af9f8f5dbd75a040f068c237ed3561af67fb937e52" Dec 04 01:45:03 crc kubenswrapper[4764]: I1204 01:45:03.463268 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4" Dec 04 01:45:04 crc kubenswrapper[4764]: I1204 01:45:04.003808 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk"] Dec 04 01:45:04 crc kubenswrapper[4764]: I1204 01:45:04.015484 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413500-w95sk"] Dec 04 01:45:04 crc kubenswrapper[4764]: I1204 01:45:04.599465 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784b53ea-54fc-406b-8bc9-51e437063d4c" path="/var/lib/kubelet/pods/784b53ea-54fc-406b-8bc9-51e437063d4c/volumes" Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.652674 4764 generic.go:334] "Generic (PLEG): container finished" podID="9b5c6bdf-2afc-44dd-bd15-055fa374edc4" containerID="b8f8e6a3e385ee785dbc8db6bf1a300c77172530efb9de5c4f63245de3503e84" exitCode=0 Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.652854 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" event={"ID":"9b5c6bdf-2afc-44dd-bd15-055fa374edc4","Type":"ContainerDied","Data":"b8f8e6a3e385ee785dbc8db6bf1a300c77172530efb9de5c4f63245de3503e84"} Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.869154 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.869225 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.869301 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.870324 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:45:20 crc kubenswrapper[4764]: I1204 01:45:20.870420 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" gracePeriod=600 Dec 04 01:45:21 crc kubenswrapper[4764]: E1204 01:45:21.014370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:45:21 crc kubenswrapper[4764]: I1204 01:45:21.441654 4764 scope.go:117] "RemoveContainer" containerID="b14fa99ec99f7a038ec7eae9833282d82da41584777b00c049a8e931eb4eb614" Dec 04 01:45:21 crc kubenswrapper[4764]: I1204 01:45:21.667752 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" 
containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" exitCode=0 Dec 04 01:45:21 crc kubenswrapper[4764]: I1204 01:45:21.667898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c"} Dec 04 01:45:21 crc kubenswrapper[4764]: I1204 01:45:21.667971 4764 scope.go:117] "RemoveContainer" containerID="2ba34224840d19d3f9bd51f0cb16c27b287d79dec31722954a999e07ba3defd5" Dec 04 01:45:21 crc kubenswrapper[4764]: I1204 01:45:21.668603 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:45:21 crc kubenswrapper[4764]: E1204 01:45:21.668869 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.188161 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.337779 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ssh-key\") pod \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.337885 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-inventory\") pod \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.337910 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9jmx\" (UniqueName: \"kubernetes.io/projected/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-kube-api-access-s9jmx\") pod \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.337993 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ceph\") pod \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\" (UID: \"9b5c6bdf-2afc-44dd-bd15-055fa374edc4\") " Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.345071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ceph" (OuterVolumeSpecName: "ceph") pod "9b5c6bdf-2afc-44dd-bd15-055fa374edc4" (UID: "9b5c6bdf-2afc-44dd-bd15-055fa374edc4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.349479 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-kube-api-access-s9jmx" (OuterVolumeSpecName: "kube-api-access-s9jmx") pod "9b5c6bdf-2afc-44dd-bd15-055fa374edc4" (UID: "9b5c6bdf-2afc-44dd-bd15-055fa374edc4"). InnerVolumeSpecName "kube-api-access-s9jmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.391457 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-inventory" (OuterVolumeSpecName: "inventory") pod "9b5c6bdf-2afc-44dd-bd15-055fa374edc4" (UID: "9b5c6bdf-2afc-44dd-bd15-055fa374edc4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.399149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b5c6bdf-2afc-44dd-bd15-055fa374edc4" (UID: "9b5c6bdf-2afc-44dd-bd15-055fa374edc4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.440384 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.440408 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9jmx\" (UniqueName: \"kubernetes.io/projected/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-kube-api-access-s9jmx\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.440418 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.440425 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b5c6bdf-2afc-44dd-bd15-055fa374edc4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.688666 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" event={"ID":"9b5c6bdf-2afc-44dd-bd15-055fa374edc4","Type":"ContainerDied","Data":"68f87ac546e889ebaa7384d81a5a32647f99954805a7b728f033e566108e8043"} Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.688748 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68f87ac546e889ebaa7384d81a5a32647f99954805a7b728f033e566108e8043" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.688818 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-nfbh6" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.802173 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-brgk5"] Dec 04 01:45:22 crc kubenswrapper[4764]: E1204 01:45:22.803025 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5c6bdf-2afc-44dd-bd15-055fa374edc4" containerName="install-os-openstack-openstack-cell1" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.803046 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5c6bdf-2afc-44dd-bd15-055fa374edc4" containerName="install-os-openstack-openstack-cell1" Dec 04 01:45:22 crc kubenswrapper[4764]: E1204 01:45:22.803067 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df1ac17-5d6f-4160-aec8-2fed933e366c" containerName="collect-profiles" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.803101 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df1ac17-5d6f-4160-aec8-2fed933e366c" containerName="collect-profiles" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.803367 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5c6bdf-2afc-44dd-bd15-055fa374edc4" containerName="install-os-openstack-openstack-cell1" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.803396 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df1ac17-5d6f-4160-aec8-2fed933e366c" containerName="collect-profiles" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.805031 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.807918 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.808205 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.808300 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.814206 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.819617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-brgk5"] Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.951134 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ceph\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.951308 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.951364 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xfz5\" 
(UniqueName: \"kubernetes.io/projected/a1e11990-8f5b-4b68-b569-71c8be08628d-kube-api-access-2xfz5\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:22 crc kubenswrapper[4764]: I1204 01:45:22.951491 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-inventory\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.054073 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-inventory\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.054440 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ceph\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.054563 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.054607 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2xfz5\" (UniqueName: \"kubernetes.io/projected/a1e11990-8f5b-4b68-b569-71c8be08628d-kube-api-access-2xfz5\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.061295 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-inventory\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.062810 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ssh-key\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.063015 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ceph\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.078553 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xfz5\" (UniqueName: \"kubernetes.io/projected/a1e11990-8f5b-4b68-b569-71c8be08628d-kube-api-access-2xfz5\") pod \"configure-os-openstack-openstack-cell1-brgk5\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: 
I1204 01:45:23.134312 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:45:23 crc kubenswrapper[4764]: I1204 01:45:23.740949 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-brgk5"] Dec 04 01:45:24 crc kubenswrapper[4764]: I1204 01:45:24.720843 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" event={"ID":"a1e11990-8f5b-4b68-b569-71c8be08628d","Type":"ContainerStarted","Data":"f08a0d41e168c594474e2b0070824302cdb0de3819d3a04bcc2f7bca77c3877a"} Dec 04 01:45:25 crc kubenswrapper[4764]: I1204 01:45:25.770198 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" event={"ID":"a1e11990-8f5b-4b68-b569-71c8be08628d","Type":"ContainerStarted","Data":"1739d9ab15a726a737a7e18d1db91fe89766d19903e26ec0201c299233ca5946"} Dec 04 01:45:25 crc kubenswrapper[4764]: I1204 01:45:25.800617 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" podStartSLOduration=2.441422374 podStartE2EDuration="3.800602169s" podCreationTimestamp="2025-12-04 01:45:22 +0000 UTC" firstStartedPulling="2025-12-04 01:45:23.746788705 +0000 UTC m=+7459.508113156" lastFinishedPulling="2025-12-04 01:45:25.10596853 +0000 UTC m=+7460.867292951" observedRunningTime="2025-12-04 01:45:25.792897689 +0000 UTC m=+7461.554222100" watchObservedRunningTime="2025-12-04 01:45:25.800602169 +0000 UTC m=+7461.561926580" Dec 04 01:45:36 crc kubenswrapper[4764]: I1204 01:45:36.546604 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:45:36 crc kubenswrapper[4764]: E1204 01:45:36.547402 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:45:50 crc kubenswrapper[4764]: I1204 01:45:50.546563 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:45:50 crc kubenswrapper[4764]: E1204 01:45:50.547470 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:46:03 crc kubenswrapper[4764]: I1204 01:46:03.546621 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:46:03 crc kubenswrapper[4764]: E1204 01:46:03.547638 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:46:10 crc kubenswrapper[4764]: I1204 01:46:10.261415 4764 generic.go:334] "Generic (PLEG): container finished" podID="a1e11990-8f5b-4b68-b569-71c8be08628d" containerID="1739d9ab15a726a737a7e18d1db91fe89766d19903e26ec0201c299233ca5946" exitCode=0 Dec 04 01:46:10 crc kubenswrapper[4764]: I1204 01:46:10.261485 
4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" event={"ID":"a1e11990-8f5b-4b68-b569-71c8be08628d","Type":"ContainerDied","Data":"1739d9ab15a726a737a7e18d1db91fe89766d19903e26ec0201c299233ca5946"} Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.796468 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.970421 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xfz5\" (UniqueName: \"kubernetes.io/projected/a1e11990-8f5b-4b68-b569-71c8be08628d-kube-api-access-2xfz5\") pod \"a1e11990-8f5b-4b68-b569-71c8be08628d\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.970941 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ssh-key\") pod \"a1e11990-8f5b-4b68-b569-71c8be08628d\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.971110 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-inventory\") pod \"a1e11990-8f5b-4b68-b569-71c8be08628d\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.971487 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ceph\") pod \"a1e11990-8f5b-4b68-b569-71c8be08628d\" (UID: \"a1e11990-8f5b-4b68-b569-71c8be08628d\") " Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.976489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ceph" (OuterVolumeSpecName: "ceph") pod "a1e11990-8f5b-4b68-b569-71c8be08628d" (UID: "a1e11990-8f5b-4b68-b569-71c8be08628d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:11 crc kubenswrapper[4764]: I1204 01:46:11.977285 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e11990-8f5b-4b68-b569-71c8be08628d-kube-api-access-2xfz5" (OuterVolumeSpecName: "kube-api-access-2xfz5") pod "a1e11990-8f5b-4b68-b569-71c8be08628d" (UID: "a1e11990-8f5b-4b68-b569-71c8be08628d"). InnerVolumeSpecName "kube-api-access-2xfz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.002056 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-inventory" (OuterVolumeSpecName: "inventory") pod "a1e11990-8f5b-4b68-b569-71c8be08628d" (UID: "a1e11990-8f5b-4b68-b569-71c8be08628d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.022889 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a1e11990-8f5b-4b68-b569-71c8be08628d" (UID: "a1e11990-8f5b-4b68-b569-71c8be08628d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.074068 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.074101 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.074113 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xfz5\" (UniqueName: \"kubernetes.io/projected/a1e11990-8f5b-4b68-b569-71c8be08628d-kube-api-access-2xfz5\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.074123 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a1e11990-8f5b-4b68-b569-71c8be08628d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.291703 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" event={"ID":"a1e11990-8f5b-4b68-b569-71c8be08628d","Type":"ContainerDied","Data":"f08a0d41e168c594474e2b0070824302cdb0de3819d3a04bcc2f7bca77c3877a"} Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.291767 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f08a0d41e168c594474e2b0070824302cdb0de3819d3a04bcc2f7bca77c3877a" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.291843 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-brgk5" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.391062 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-585vt"] Dec 04 01:46:12 crc kubenswrapper[4764]: E1204 01:46:12.391882 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e11990-8f5b-4b68-b569-71c8be08628d" containerName="configure-os-openstack-openstack-cell1" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.391912 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e11990-8f5b-4b68-b569-71c8be08628d" containerName="configure-os-openstack-openstack-cell1" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.392246 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e11990-8f5b-4b68-b569-71c8be08628d" containerName="configure-os-openstack-openstack-cell1" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.393490 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.396213 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.396367 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.396865 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.408224 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.414917 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-585vt"] Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.488459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.488638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-inventory-0\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.488724 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ceph\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.488770 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vmm\" (UniqueName: \"kubernetes.io/projected/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-kube-api-access-79vmm\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.591207 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ceph\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.591801 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vmm\" (UniqueName: \"kubernetes.io/projected/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-kube-api-access-79vmm\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.591931 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.592426 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-inventory-0\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.598068 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.598223 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ceph\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.605881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-inventory-0\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.611262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vmm\" (UniqueName: \"kubernetes.io/projected/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-kube-api-access-79vmm\") pod \"ssh-known-hosts-openstack-585vt\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:12 crc kubenswrapper[4764]: I1204 01:46:12.711863 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:13 crc kubenswrapper[4764]: I1204 01:46:13.325562 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-585vt"] Dec 04 01:46:14 crc kubenswrapper[4764]: I1204 01:46:14.314118 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-585vt" event={"ID":"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020","Type":"ContainerStarted","Data":"a64d1096ddadc6f92f6ab024e7ba449cb92516fa952a59958aa1e2b35c05382c"} Dec 04 01:46:14 crc kubenswrapper[4764]: I1204 01:46:14.314505 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-585vt" event={"ID":"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020","Type":"ContainerStarted","Data":"b8e03f9c5d8c6a1e67a6986602638fefa97412150830688177741c3d2d2a01b9"} Dec 04 01:46:14 crc kubenswrapper[4764]: I1204 01:46:14.344663 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-585vt" podStartSLOduration=1.826165168 podStartE2EDuration="2.344647739s" podCreationTimestamp="2025-12-04 01:46:12 +0000 UTC" firstStartedPulling="2025-12-04 01:46:13.319537317 +0000 UTC m=+7509.080861738" lastFinishedPulling="2025-12-04 01:46:13.838019868 +0000 UTC m=+7509.599344309" observedRunningTime="2025-12-04 01:46:14.341982103 +0000 UTC m=+7510.103306524" watchObservedRunningTime="2025-12-04 01:46:14.344647739 +0000 UTC m=+7510.105972150" Dec 04 01:46:17 crc kubenswrapper[4764]: I1204 01:46:17.546178 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:46:17 crc kubenswrapper[4764]: E1204 01:46:17.546989 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:46:23 crc kubenswrapper[4764]: I1204 01:46:23.423462 4764 generic.go:334] "Generic (PLEG): container finished" podID="c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" containerID="a64d1096ddadc6f92f6ab024e7ba449cb92516fa952a59958aa1e2b35c05382c" exitCode=0 Dec 04 01:46:23 crc kubenswrapper[4764]: I1204 01:46:23.423523 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-585vt" event={"ID":"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020","Type":"ContainerDied","Data":"a64d1096ddadc6f92f6ab024e7ba449cb92516fa952a59958aa1e2b35c05382c"} Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.878915 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.900178 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79vmm\" (UniqueName: \"kubernetes.io/projected/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-kube-api-access-79vmm\") pod \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.900597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ssh-key-openstack-cell1\") pod \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.900758 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-inventory-0\") pod 
\"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.900806 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ceph\") pod \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\" (UID: \"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020\") " Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.905901 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-kube-api-access-79vmm" (OuterVolumeSpecName: "kube-api-access-79vmm") pod "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" (UID: "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020"). InnerVolumeSpecName "kube-api-access-79vmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.909381 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ceph" (OuterVolumeSpecName: "ceph") pod "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" (UID: "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.942397 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" (UID: "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:24 crc kubenswrapper[4764]: I1204 01:46:24.955065 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" (UID: "c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.004174 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79vmm\" (UniqueName: \"kubernetes.io/projected/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-kube-api-access-79vmm\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.004283 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.004364 4764 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.004387 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.444293 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-585vt" event={"ID":"c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020","Type":"ContainerDied","Data":"b8e03f9c5d8c6a1e67a6986602638fefa97412150830688177741c3d2d2a01b9"} Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.444338 4764 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="b8e03f9c5d8c6a1e67a6986602638fefa97412150830688177741c3d2d2a01b9" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.444412 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-585vt" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.609011 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-5f8s8"] Dec 04 01:46:25 crc kubenswrapper[4764]: E1204 01:46:25.609896 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" containerName="ssh-known-hosts-openstack" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.609917 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" containerName="ssh-known-hosts-openstack" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.610204 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020" containerName="ssh-known-hosts-openstack" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.611021 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.613251 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.613628 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.613953 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.615938 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-inventory\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.616053 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ceph\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.616124 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ssh-key\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.616158 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pjhnp\" (UniqueName: \"kubernetes.io/projected/17743066-ad51-4fa0-ad0b-27b20e412a5a-kube-api-access-pjhnp\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.617173 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.635850 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-5f8s8"] Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.718349 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ssh-key\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.718406 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjhnp\" (UniqueName: \"kubernetes.io/projected/17743066-ad51-4fa0-ad0b-27b20e412a5a-kube-api-access-pjhnp\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.718534 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-inventory\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.718686 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ceph\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.724262 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ssh-key\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.726594 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ceph\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.732049 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-inventory\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.735376 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjhnp\" (UniqueName: \"kubernetes.io/projected/17743066-ad51-4fa0-ad0b-27b20e412a5a-kube-api-access-pjhnp\") pod \"run-os-openstack-openstack-cell1-5f8s8\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:25 crc kubenswrapper[4764]: I1204 01:46:25.930273 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:26 crc kubenswrapper[4764]: I1204 01:46:26.731340 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-5f8s8"] Dec 04 01:46:27 crc kubenswrapper[4764]: I1204 01:46:27.486639 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" event={"ID":"17743066-ad51-4fa0-ad0b-27b20e412a5a","Type":"ContainerStarted","Data":"8175930b40fb10c63f68f748ead68ccbacdb867d4198f4881a904ea76481872e"} Dec 04 01:46:28 crc kubenswrapper[4764]: I1204 01:46:28.497312 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" event={"ID":"17743066-ad51-4fa0-ad0b-27b20e412a5a","Type":"ContainerStarted","Data":"9c0f3cfaa7706027b69b65f02aa2963c735bf7fb64278928d93b270380e3a595"} Dec 04 01:46:28 crc kubenswrapper[4764]: I1204 01:46:28.520018 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" podStartSLOduration=3.117729635 podStartE2EDuration="3.520001996s" podCreationTimestamp="2025-12-04 01:46:25 +0000 UTC" firstStartedPulling="2025-12-04 01:46:26.71955335 +0000 UTC m=+7522.480877761" lastFinishedPulling="2025-12-04 01:46:27.121825711 +0000 UTC m=+7522.883150122" observedRunningTime="2025-12-04 01:46:28.512200624 +0000 UTC m=+7524.273525035" watchObservedRunningTime="2025-12-04 01:46:28.520001996 +0000 UTC m=+7524.281326407" Dec 04 01:46:32 crc kubenswrapper[4764]: I1204 01:46:32.546282 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:46:32 crc kubenswrapper[4764]: E1204 01:46:32.547396 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:46:36 crc kubenswrapper[4764]: I1204 01:46:36.612287 4764 generic.go:334] "Generic (PLEG): container finished" podID="17743066-ad51-4fa0-ad0b-27b20e412a5a" containerID="9c0f3cfaa7706027b69b65f02aa2963c735bf7fb64278928d93b270380e3a595" exitCode=0 Dec 04 01:46:36 crc kubenswrapper[4764]: I1204 01:46:36.612454 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" event={"ID":"17743066-ad51-4fa0-ad0b-27b20e412a5a","Type":"ContainerDied","Data":"9c0f3cfaa7706027b69b65f02aa2963c735bf7fb64278928d93b270380e3a595"} Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.204710 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.331587 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ssh-key\") pod \"17743066-ad51-4fa0-ad0b-27b20e412a5a\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.332112 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-inventory\") pod \"17743066-ad51-4fa0-ad0b-27b20e412a5a\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.332262 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ceph\") pod \"17743066-ad51-4fa0-ad0b-27b20e412a5a\" (UID: 
\"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.332467 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjhnp\" (UniqueName: \"kubernetes.io/projected/17743066-ad51-4fa0-ad0b-27b20e412a5a-kube-api-access-pjhnp\") pod \"17743066-ad51-4fa0-ad0b-27b20e412a5a\" (UID: \"17743066-ad51-4fa0-ad0b-27b20e412a5a\") " Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.339099 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ceph" (OuterVolumeSpecName: "ceph") pod "17743066-ad51-4fa0-ad0b-27b20e412a5a" (UID: "17743066-ad51-4fa0-ad0b-27b20e412a5a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.347446 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17743066-ad51-4fa0-ad0b-27b20e412a5a-kube-api-access-pjhnp" (OuterVolumeSpecName: "kube-api-access-pjhnp") pod "17743066-ad51-4fa0-ad0b-27b20e412a5a" (UID: "17743066-ad51-4fa0-ad0b-27b20e412a5a"). InnerVolumeSpecName "kube-api-access-pjhnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.381764 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-inventory" (OuterVolumeSpecName: "inventory") pod "17743066-ad51-4fa0-ad0b-27b20e412a5a" (UID: "17743066-ad51-4fa0-ad0b-27b20e412a5a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.385840 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17743066-ad51-4fa0-ad0b-27b20e412a5a" (UID: "17743066-ad51-4fa0-ad0b-27b20e412a5a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.435221 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.435259 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.435274 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjhnp\" (UniqueName: \"kubernetes.io/projected/17743066-ad51-4fa0-ad0b-27b20e412a5a-kube-api-access-pjhnp\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.435288 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17743066-ad51-4fa0-ad0b-27b20e412a5a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.638985 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" event={"ID":"17743066-ad51-4fa0-ad0b-27b20e412a5a","Type":"ContainerDied","Data":"8175930b40fb10c63f68f748ead68ccbacdb867d4198f4881a904ea76481872e"} Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.639030 4764 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8175930b40fb10c63f68f748ead68ccbacdb867d4198f4881a904ea76481872e" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.639502 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-5f8s8" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.714905 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-l2n82"] Dec 04 01:46:38 crc kubenswrapper[4764]: E1204 01:46:38.715347 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17743066-ad51-4fa0-ad0b-27b20e412a5a" containerName="run-os-openstack-openstack-cell1" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.715371 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="17743066-ad51-4fa0-ad0b-27b20e412a5a" containerName="run-os-openstack-openstack-cell1" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.715596 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="17743066-ad51-4fa0-ad0b-27b20e412a5a" containerName="run-os-openstack-openstack-cell1" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.720542 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.722503 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.722590 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.723331 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.738109 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.746669 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-l2n82"] Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.845886 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-inventory\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.845957 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-786xg\" (UniqueName: \"kubernetes.io/projected/fb653e7a-4650-4b7c-a875-6773b4db51c6-kube-api-access-786xg\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.846359 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ceph\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.846459 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.949223 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ceph\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.949346 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.949540 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-inventory\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.949600 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-786xg\" (UniqueName: \"kubernetes.io/projected/fb653e7a-4650-4b7c-a875-6773b4db51c6-kube-api-access-786xg\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.954881 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-inventory\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.956903 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.961100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ceph\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:38 crc kubenswrapper[4764]: I1204 01:46:38.972510 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-786xg\" (UniqueName: \"kubernetes.io/projected/fb653e7a-4650-4b7c-a875-6773b4db51c6-kube-api-access-786xg\") pod \"reboot-os-openstack-openstack-cell1-l2n82\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:39 crc kubenswrapper[4764]: I1204 01:46:39.070401 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:39 crc kubenswrapper[4764]: I1204 01:46:39.617236 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-l2n82"] Dec 04 01:46:39 crc kubenswrapper[4764]: W1204 01:46:39.620916 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb653e7a_4650_4b7c_a875_6773b4db51c6.slice/crio-d8745e4e32cac1f9dfc5b416f63f038b5b2acb9d0127d8ac61823e21f401a6df WatchSource:0}: Error finding container d8745e4e32cac1f9dfc5b416f63f038b5b2acb9d0127d8ac61823e21f401a6df: Status 404 returned error can't find the container with id d8745e4e32cac1f9dfc5b416f63f038b5b2acb9d0127d8ac61823e21f401a6df Dec 04 01:46:39 crc kubenswrapper[4764]: I1204 01:46:39.649807 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" event={"ID":"fb653e7a-4650-4b7c-a875-6773b4db51c6","Type":"ContainerStarted","Data":"d8745e4e32cac1f9dfc5b416f63f038b5b2acb9d0127d8ac61823e21f401a6df"} Dec 04 01:46:40 crc kubenswrapper[4764]: I1204 01:46:40.663060 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" event={"ID":"fb653e7a-4650-4b7c-a875-6773b4db51c6","Type":"ContainerStarted","Data":"a8d2a5b1bab57f51d01914d2658893f5182bb03a921d45b28f33f3d9f156d90b"} Dec 04 01:46:40 crc kubenswrapper[4764]: I1204 01:46:40.689782 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" podStartSLOduration=2.231449466 podStartE2EDuration="2.689750987s" podCreationTimestamp="2025-12-04 01:46:38 +0000 UTC" firstStartedPulling="2025-12-04 01:46:39.623608934 +0000 UTC m=+7535.384933345" lastFinishedPulling="2025-12-04 01:46:40.081910455 +0000 UTC m=+7535.843234866" observedRunningTime="2025-12-04 01:46:40.680306174 +0000 UTC 
m=+7536.441630585" watchObservedRunningTime="2025-12-04 01:46:40.689750987 +0000 UTC m=+7536.451075408" Dec 04 01:46:44 crc kubenswrapper[4764]: I1204 01:46:44.566414 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:46:44 crc kubenswrapper[4764]: E1204 01:46:44.567793 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:46:56 crc kubenswrapper[4764]: I1204 01:46:56.892419 4764 generic.go:334] "Generic (PLEG): container finished" podID="fb653e7a-4650-4b7c-a875-6773b4db51c6" containerID="a8d2a5b1bab57f51d01914d2658893f5182bb03a921d45b28f33f3d9f156d90b" exitCode=0 Dec 04 01:46:56 crc kubenswrapper[4764]: I1204 01:46:56.892535 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" event={"ID":"fb653e7a-4650-4b7c-a875-6773b4db51c6","Type":"ContainerDied","Data":"a8d2a5b1bab57f51d01914d2658893f5182bb03a921d45b28f33f3d9f156d90b"} Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.521544 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.702056 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-inventory\") pod \"fb653e7a-4650-4b7c-a875-6773b4db51c6\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.702283 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ceph\") pod \"fb653e7a-4650-4b7c-a875-6773b4db51c6\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.702383 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ssh-key\") pod \"fb653e7a-4650-4b7c-a875-6773b4db51c6\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.702530 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-786xg\" (UniqueName: \"kubernetes.io/projected/fb653e7a-4650-4b7c-a875-6773b4db51c6-kube-api-access-786xg\") pod \"fb653e7a-4650-4b7c-a875-6773b4db51c6\" (UID: \"fb653e7a-4650-4b7c-a875-6773b4db51c6\") " Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.709028 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb653e7a-4650-4b7c-a875-6773b4db51c6-kube-api-access-786xg" (OuterVolumeSpecName: "kube-api-access-786xg") pod "fb653e7a-4650-4b7c-a875-6773b4db51c6" (UID: "fb653e7a-4650-4b7c-a875-6773b4db51c6"). InnerVolumeSpecName "kube-api-access-786xg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.713928 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ceph" (OuterVolumeSpecName: "ceph") pod "fb653e7a-4650-4b7c-a875-6773b4db51c6" (UID: "fb653e7a-4650-4b7c-a875-6773b4db51c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.730320 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-inventory" (OuterVolumeSpecName: "inventory") pod "fb653e7a-4650-4b7c-a875-6773b4db51c6" (UID: "fb653e7a-4650-4b7c-a875-6773b4db51c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.740602 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fb653e7a-4650-4b7c-a875-6773b4db51c6" (UID: "fb653e7a-4650-4b7c-a875-6773b4db51c6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.807707 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.807804 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.807817 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fb653e7a-4650-4b7c-a875-6773b4db51c6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.807830 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-786xg\" (UniqueName: \"kubernetes.io/projected/fb653e7a-4650-4b7c-a875-6773b4db51c6-kube-api-access-786xg\") on node \"crc\" DevicePath \"\"" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.918062 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" event={"ID":"fb653e7a-4650-4b7c-a875-6773b4db51c6","Type":"ContainerDied","Data":"d8745e4e32cac1f9dfc5b416f63f038b5b2acb9d0127d8ac61823e21f401a6df"} Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.918428 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8745e4e32cac1f9dfc5b416f63f038b5b2acb9d0127d8ac61823e21f401a6df" Dec 04 01:46:58 crc kubenswrapper[4764]: I1204 01:46:58.918521 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-l2n82" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.027827 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-75zbq"] Dec 04 01:46:59 crc kubenswrapper[4764]: E1204 01:46:59.028487 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb653e7a-4650-4b7c-a875-6773b4db51c6" containerName="reboot-os-openstack-openstack-cell1" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.028516 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb653e7a-4650-4b7c-a875-6773b4db51c6" containerName="reboot-os-openstack-openstack-cell1" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.028921 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb653e7a-4650-4b7c-a875-6773b4db51c6" containerName="reboot-os-openstack-openstack-cell1" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.029966 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.032685 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.032976 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.033213 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.033378 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.040225 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-75zbq"] Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113584 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-inventory\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113664 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5d9\" (UniqueName: \"kubernetes.io/projected/1c4de713-de0e-456d-9015-b2997b2ab3e1-kube-api-access-hn5d9\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113689 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113776 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113824 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113849 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113876 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113905 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113920 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113941 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ceph\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.113987 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ssh-key\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " 
pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.114014 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.216815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ssh-key\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.217297 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.217575 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-inventory\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.217921 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn5d9\" (UniqueName: 
\"kubernetes.io/projected/1c4de713-de0e-456d-9015-b2997b2ab3e1-kube-api-access-hn5d9\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.218176 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.218537 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.218884 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.219164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " 
pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.219443 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.219687 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.219965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.220220 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ceph\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.225657 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ssh-key\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.226368 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-inventory\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.226748 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.227016 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.227139 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 
01:46:59.228151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.228421 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.228492 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.228508 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ceph\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.229523 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-nova-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.232842 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.242633 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn5d9\" (UniqueName: \"kubernetes.io/projected/1c4de713-de0e-456d-9015-b2997b2ab3e1-kube-api-access-hn5d9\") pod \"install-certs-openstack-openstack-cell1-75zbq\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.346278 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.547768 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:46:59 crc kubenswrapper[4764]: E1204 01:46:59.548308 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:46:59 crc kubenswrapper[4764]: I1204 01:46:59.944478 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-75zbq"] Dec 04 01:47:00 crc kubenswrapper[4764]: I1204 01:47:00.946329 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" event={"ID":"1c4de713-de0e-456d-9015-b2997b2ab3e1","Type":"ContainerStarted","Data":"82089f9e9698eb15195f8d024dd342e741a8bd45f3519ba85231664a864dd9eb"} Dec 04 01:47:00 crc kubenswrapper[4764]: I1204 01:47:00.946886 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" event={"ID":"1c4de713-de0e-456d-9015-b2997b2ab3e1","Type":"ContainerStarted","Data":"0f4a4e68176d0f1089cc69df2a60028018daf0472a82d9a042476a3e1687db81"} Dec 04 01:47:14 crc kubenswrapper[4764]: I1204 01:47:14.554654 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:47:14 crc kubenswrapper[4764]: E1204 01:47:14.555615 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:47:20 crc kubenswrapper[4764]: I1204 01:47:20.171880 4764 generic.go:334] "Generic (PLEG): container finished" podID="1c4de713-de0e-456d-9015-b2997b2ab3e1" containerID="82089f9e9698eb15195f8d024dd342e741a8bd45f3519ba85231664a864dd9eb" exitCode=0 Dec 04 01:47:20 crc kubenswrapper[4764]: I1204 01:47:20.171965 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" event={"ID":"1c4de713-de0e-456d-9015-b2997b2ab3e1","Type":"ContainerDied","Data":"82089f9e9698eb15195f8d024dd342e741a8bd45f3519ba85231664a864dd9eb"} Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.530927 4764 scope.go:117] "RemoveContainer" containerID="c4ce06187ffc50809a1fbdb3115b5f2a49f5ca153839b51443cd5e7a2a3cde5b" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.564041 4764 scope.go:117] "RemoveContainer" containerID="c9efa9f5288b31f38f2f4d751f7774ba6feb295d70b057cf54a86f14be36adc3" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.630103 4764 scope.go:117] "RemoveContainer" containerID="72e99312ac930679368e7b975d1a173a821a1f30420a9014c72fb86f2d9ae624" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.720786 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.766828 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-telemetry-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.766914 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-bootstrap-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.766997 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-inventory\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767073 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-libvirt-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767187 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-sriov-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 
01:47:21.767227 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-metadata-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767367 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-nova-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767433 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ssh-key\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767538 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ceph\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-dhcp-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767644 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ovn-combined-ca-bundle\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.767822 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn5d9\" (UniqueName: \"kubernetes.io/projected/1c4de713-de0e-456d-9015-b2997b2ab3e1-kube-api-access-hn5d9\") pod \"1c4de713-de0e-456d-9015-b2997b2ab3e1\" (UID: \"1c4de713-de0e-456d-9015-b2997b2ab3e1\") " Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.781286 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ceph" (OuterVolumeSpecName: "ceph") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.784918 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.789222 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.789521 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.789650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.789852 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.790001 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c4de713-de0e-456d-9015-b2997b2ab3e1-kube-api-access-hn5d9" (OuterVolumeSpecName: "kube-api-access-hn5d9") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "kube-api-access-hn5d9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.798131 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.804047 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.806632 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.837917 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-inventory" (OuterVolumeSpecName: "inventory") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.848973 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1c4de713-de0e-456d-9015-b2997b2ab3e1" (UID: "1c4de713-de0e-456d-9015-b2997b2ab3e1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870425 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn5d9\" (UniqueName: \"kubernetes.io/projected/1c4de713-de0e-456d-9015-b2997b2ab3e1-kube-api-access-hn5d9\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870458 4764 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870469 4764 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870478 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870489 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870500 4764 reconciler_common.go:293] "Volume detached for 
volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870511 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870520 4764 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870529 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870537 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870547 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:21 crc kubenswrapper[4764]: I1204 01:47:21.870558 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c4de713-de0e-456d-9015-b2997b2ab3e1-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.192857 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-75zbq" event={"ID":"1c4de713-de0e-456d-9015-b2997b2ab3e1","Type":"ContainerDied","Data":"0f4a4e68176d0f1089cc69df2a60028018daf0472a82d9a042476a3e1687db81"} Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.192895 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-75zbq" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.192905 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f4a4e68176d0f1089cc69df2a60028018daf0472a82d9a042476a3e1687db81" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.308370 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-95f2v"] Dec 04 01:47:22 crc kubenswrapper[4764]: E1204 01:47:22.308883 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c4de713-de0e-456d-9015-b2997b2ab3e1" containerName="install-certs-openstack-openstack-cell1" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.308898 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c4de713-de0e-456d-9015-b2997b2ab3e1" containerName="install-certs-openstack-openstack-cell1" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.309118 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c4de713-de0e-456d-9015-b2997b2ab3e1" containerName="install-certs-openstack-openstack-cell1" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.309920 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.313964 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.313983 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.314250 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.314369 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.323267 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-95f2v"] Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.380471 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrm89\" (UniqueName: \"kubernetes.io/projected/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-kube-api-access-wrm89\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.380917 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.381055 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.381270 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.483693 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.484020 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.484134 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrm89\" (UniqueName: \"kubernetes.io/projected/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-kube-api-access-wrm89\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.484397 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.489038 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ceph\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.489293 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-inventory\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.494501 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.506331 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrm89\" (UniqueName: \"kubernetes.io/projected/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-kube-api-access-wrm89\") pod \"ceph-client-openstack-openstack-cell1-95f2v\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:22 crc kubenswrapper[4764]: I1204 01:47:22.634673 4764 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:23 crc kubenswrapper[4764]: I1204 01:47:23.247274 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-95f2v"] Dec 04 01:47:24 crc kubenswrapper[4764]: I1204 01:47:24.221226 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" event={"ID":"4d57a9e3-2efc-449f-8d83-5b42ce3642c1","Type":"ContainerStarted","Data":"4d0687e39686e4db3c08fa7ba21c7810c6102402c4671639f3eed578f3b58063"} Dec 04 01:47:24 crc kubenswrapper[4764]: I1204 01:47:24.221676 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" event={"ID":"4d57a9e3-2efc-449f-8d83-5b42ce3642c1","Type":"ContainerStarted","Data":"4275eefc7d26ba9fcf650bb5f46115fd9f93d2a68427aabe44726794421abd48"} Dec 04 01:47:24 crc kubenswrapper[4764]: I1204 01:47:24.258436 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" podStartSLOduration=1.587915234 podStartE2EDuration="2.258420295s" podCreationTimestamp="2025-12-04 01:47:22 +0000 UTC" firstStartedPulling="2025-12-04 01:47:23.251912776 +0000 UTC m=+7579.013237187" lastFinishedPulling="2025-12-04 01:47:23.922417807 +0000 UTC m=+7579.683742248" observedRunningTime="2025-12-04 01:47:24.247554298 +0000 UTC m=+7580.008878729" watchObservedRunningTime="2025-12-04 01:47:24.258420295 +0000 UTC m=+7580.019744706" Dec 04 01:47:26 crc kubenswrapper[4764]: I1204 01:47:26.547409 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:47:26 crc kubenswrapper[4764]: E1204 01:47:26.548554 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:47:29 crc kubenswrapper[4764]: I1204 01:47:29.277138 4764 generic.go:334] "Generic (PLEG): container finished" podID="4d57a9e3-2efc-449f-8d83-5b42ce3642c1" containerID="4d0687e39686e4db3c08fa7ba21c7810c6102402c4671639f3eed578f3b58063" exitCode=0 Dec 04 01:47:29 crc kubenswrapper[4764]: I1204 01:47:29.277242 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" event={"ID":"4d57a9e3-2efc-449f-8d83-5b42ce3642c1","Type":"ContainerDied","Data":"4d0687e39686e4db3c08fa7ba21c7810c6102402c4671639f3eed578f3b58063"} Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.896858 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.982154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrm89\" (UniqueName: \"kubernetes.io/projected/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-kube-api-access-wrm89\") pod \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.982327 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ssh-key\") pod \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.982814 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-inventory\") pod \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.982859 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ceph\") pod \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\" (UID: \"4d57a9e3-2efc-449f-8d83-5b42ce3642c1\") " Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.988104 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ceph" (OuterVolumeSpecName: "ceph") pod "4d57a9e3-2efc-449f-8d83-5b42ce3642c1" (UID: "4d57a9e3-2efc-449f-8d83-5b42ce3642c1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:30 crc kubenswrapper[4764]: I1204 01:47:30.988135 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-kube-api-access-wrm89" (OuterVolumeSpecName: "kube-api-access-wrm89") pod "4d57a9e3-2efc-449f-8d83-5b42ce3642c1" (UID: "4d57a9e3-2efc-449f-8d83-5b42ce3642c1"). InnerVolumeSpecName "kube-api-access-wrm89". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.017364 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-inventory" (OuterVolumeSpecName: "inventory") pod "4d57a9e3-2efc-449f-8d83-5b42ce3642c1" (UID: "4d57a9e3-2efc-449f-8d83-5b42ce3642c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.025953 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d57a9e3-2efc-449f-8d83-5b42ce3642c1" (UID: "4d57a9e3-2efc-449f-8d83-5b42ce3642c1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.085776 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrm89\" (UniqueName: \"kubernetes.io/projected/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-kube-api-access-wrm89\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.085818 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.085831 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.085843 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4d57a9e3-2efc-449f-8d83-5b42ce3642c1-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.303670 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" event={"ID":"4d57a9e3-2efc-449f-8d83-5b42ce3642c1","Type":"ContainerDied","Data":"4275eefc7d26ba9fcf650bb5f46115fd9f93d2a68427aabe44726794421abd48"} Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.303967 4764 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4275eefc7d26ba9fcf650bb5f46115fd9f93d2a68427aabe44726794421abd48" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.303794 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-95f2v" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.402242 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-g5wgp"] Dec 04 01:47:31 crc kubenswrapper[4764]: E1204 01:47:31.402793 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d57a9e3-2efc-449f-8d83-5b42ce3642c1" containerName="ceph-client-openstack-openstack-cell1" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.402813 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d57a9e3-2efc-449f-8d83-5b42ce3642c1" containerName="ceph-client-openstack-openstack-cell1" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.403097 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d57a9e3-2efc-449f-8d83-5b42ce3642c1" containerName="ceph-client-openstack-openstack-cell1" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.404010 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.409558 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.413027 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.413113 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.413228 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.413236 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.421188 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-g5wgp"] Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.494577 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-inventory\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.494678 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw8z2\" (UniqueName: \"kubernetes.io/projected/1b4ad484-2065-4ff3-9c95-30391bbec966-kube-api-access-zw8z2\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: 
I1204 01:47:31.494800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ssh-key\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.494966 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.495041 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1b4ad484-2065-4ff3-9c95-30391bbec966-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.495243 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ceph\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.597637 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw8z2\" (UniqueName: \"kubernetes.io/projected/1b4ad484-2065-4ff3-9c95-30391bbec966-kube-api-access-zw8z2\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") 
" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.597757 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ssh-key\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.597815 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.597852 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1b4ad484-2065-4ff3-9c95-30391bbec966-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.597923 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ceph\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.598026 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-inventory\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " 
pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.599920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1b4ad484-2065-4ff3-9c95-30391bbec966-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.603385 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-inventory\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.603685 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ceph\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.604180 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.616307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ssh-key\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " 
pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.624517 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw8z2\" (UniqueName: \"kubernetes.io/projected/1b4ad484-2065-4ff3-9c95-30391bbec966-kube-api-access-zw8z2\") pod \"ovn-openstack-openstack-cell1-g5wgp\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:31 crc kubenswrapper[4764]: I1204 01:47:31.724888 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:47:32 crc kubenswrapper[4764]: I1204 01:47:32.289601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-g5wgp"] Dec 04 01:47:32 crc kubenswrapper[4764]: W1204 01:47:32.297844 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b4ad484_2065_4ff3_9c95_30391bbec966.slice/crio-8681bb1dd917ccb49447a8be9777634c6d9d1039b3dd3efabbb6d79a2f2537a1 WatchSource:0}: Error finding container 8681bb1dd917ccb49447a8be9777634c6d9d1039b3dd3efabbb6d79a2f2537a1: Status 404 returned error can't find the container with id 8681bb1dd917ccb49447a8be9777634c6d9d1039b3dd3efabbb6d79a2f2537a1 Dec 04 01:47:32 crc kubenswrapper[4764]: I1204 01:47:32.330266 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" event={"ID":"1b4ad484-2065-4ff3-9c95-30391bbec966","Type":"ContainerStarted","Data":"8681bb1dd917ccb49447a8be9777634c6d9d1039b3dd3efabbb6d79a2f2537a1"} Dec 04 01:47:33 crc kubenswrapper[4764]: I1204 01:47:33.344276 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" 
event={"ID":"1b4ad484-2065-4ff3-9c95-30391bbec966","Type":"ContainerStarted","Data":"2664d37fe2bc663f66fe7d8ed2f9a28fe599967800c9602eb6c6fce2a0e6cf92"} Dec 04 01:47:33 crc kubenswrapper[4764]: I1204 01:47:33.393895 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" podStartSLOduration=1.706394811 podStartE2EDuration="2.393861337s" podCreationTimestamp="2025-12-04 01:47:31 +0000 UTC" firstStartedPulling="2025-12-04 01:47:32.316153191 +0000 UTC m=+7588.077477602" lastFinishedPulling="2025-12-04 01:47:33.003619677 +0000 UTC m=+7588.764944128" observedRunningTime="2025-12-04 01:47:33.371224101 +0000 UTC m=+7589.132548562" watchObservedRunningTime="2025-12-04 01:47:33.393861337 +0000 UTC m=+7589.155185798" Dec 04 01:47:39 crc kubenswrapper[4764]: I1204 01:47:39.546240 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:47:39 crc kubenswrapper[4764]: E1204 01:47:39.547024 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:47:50 crc kubenswrapper[4764]: I1204 01:47:50.546687 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:47:50 crc kubenswrapper[4764]: E1204 01:47:50.548079 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:47:53 crc kubenswrapper[4764]: I1204 01:47:53.951197 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5x52b"] Dec 04 01:47:53 crc kubenswrapper[4764]: I1204 01:47:53.954875 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:53 crc kubenswrapper[4764]: I1204 01:47:53.961369 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vzl\" (UniqueName: \"kubernetes.io/projected/7a602ce6-19a8-4a6c-8334-5f805276b554-kube-api-access-92vzl\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:53 crc kubenswrapper[4764]: I1204 01:47:53.961591 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-utilities\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:53 crc kubenswrapper[4764]: I1204 01:47:53.961636 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-catalog-content\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:53 crc kubenswrapper[4764]: I1204 01:47:53.971390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5x52b"] Dec 04 01:47:54 crc 
kubenswrapper[4764]: I1204 01:47:54.063555 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-utilities\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.063797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-catalog-content\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.063971 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vzl\" (UniqueName: \"kubernetes.io/projected/7a602ce6-19a8-4a6c-8334-5f805276b554-kube-api-access-92vzl\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.064135 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-utilities\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.064438 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-catalog-content\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.088214 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vzl\" (UniqueName: \"kubernetes.io/projected/7a602ce6-19a8-4a6c-8334-5f805276b554-kube-api-access-92vzl\") pod \"redhat-operators-5x52b\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.276608 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:47:54 crc kubenswrapper[4764]: I1204 01:47:54.784354 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5x52b"] Dec 04 01:47:55 crc kubenswrapper[4764]: I1204 01:47:55.587155 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerID="3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001" exitCode=0 Dec 04 01:47:55 crc kubenswrapper[4764]: I1204 01:47:55.587203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerDied","Data":"3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001"} Dec 04 01:47:55 crc kubenswrapper[4764]: I1204 01:47:55.587537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerStarted","Data":"f48512b3770ffb67494a1ec36d294b12cc55e1436f2f9df9f9563189d8f658c8"} Dec 04 01:47:56 crc kubenswrapper[4764]: I1204 01:47:56.604817 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerStarted","Data":"83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313"} Dec 04 01:47:59 crc kubenswrapper[4764]: I1204 01:47:59.641237 4764 generic.go:334] "Generic 
(PLEG): container finished" podID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerID="83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313" exitCode=0 Dec 04 01:47:59 crc kubenswrapper[4764]: I1204 01:47:59.641321 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerDied","Data":"83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313"} Dec 04 01:48:00 crc kubenswrapper[4764]: I1204 01:48:00.655692 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerStarted","Data":"79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe"} Dec 04 01:48:00 crc kubenswrapper[4764]: I1204 01:48:00.675128 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5x52b" podStartSLOduration=2.953024143 podStartE2EDuration="7.675107244s" podCreationTimestamp="2025-12-04 01:47:53 +0000 UTC" firstStartedPulling="2025-12-04 01:47:55.590016621 +0000 UTC m=+7611.351341042" lastFinishedPulling="2025-12-04 01:48:00.312099702 +0000 UTC m=+7616.073424143" observedRunningTime="2025-12-04 01:48:00.67168628 +0000 UTC m=+7616.433010691" watchObservedRunningTime="2025-12-04 01:48:00.675107244 +0000 UTC m=+7616.436431655" Dec 04 01:48:01 crc kubenswrapper[4764]: I1204 01:48:01.547042 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:48:01 crc kubenswrapper[4764]: E1204 01:48:01.547626 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:48:04 crc kubenswrapper[4764]: I1204 01:48:04.277316 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:48:04 crc kubenswrapper[4764]: I1204 01:48:04.278102 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:48:05 crc kubenswrapper[4764]: I1204 01:48:05.332216 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5x52b" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="registry-server" probeResult="failure" output=< Dec 04 01:48:05 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 01:48:05 crc kubenswrapper[4764]: > Dec 04 01:48:14 crc kubenswrapper[4764]: I1204 01:48:14.324341 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:48:14 crc kubenswrapper[4764]: I1204 01:48:14.373827 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:48:14 crc kubenswrapper[4764]: I1204 01:48:14.558957 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:48:14 crc kubenswrapper[4764]: E1204 01:48:14.559968 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:48:14 crc kubenswrapper[4764]: I1204 
01:48:14.572478 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5x52b"] Dec 04 01:48:15 crc kubenswrapper[4764]: I1204 01:48:15.826445 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5x52b" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="registry-server" containerID="cri-o://79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe" gracePeriod=2 Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.377074 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.577202 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-utilities\") pod \"7a602ce6-19a8-4a6c-8334-5f805276b554\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.577524 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92vzl\" (UniqueName: \"kubernetes.io/projected/7a602ce6-19a8-4a6c-8334-5f805276b554-kube-api-access-92vzl\") pod \"7a602ce6-19a8-4a6c-8334-5f805276b554\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.577904 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-catalog-content\") pod \"7a602ce6-19a8-4a6c-8334-5f805276b554\" (UID: \"7a602ce6-19a8-4a6c-8334-5f805276b554\") " Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.578967 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-utilities" (OuterVolumeSpecName: 
"utilities") pod "7a602ce6-19a8-4a6c-8334-5f805276b554" (UID: "7a602ce6-19a8-4a6c-8334-5f805276b554"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.589992 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a602ce6-19a8-4a6c-8334-5f805276b554-kube-api-access-92vzl" (OuterVolumeSpecName: "kube-api-access-92vzl") pod "7a602ce6-19a8-4a6c-8334-5f805276b554" (UID: "7a602ce6-19a8-4a6c-8334-5f805276b554"). InnerVolumeSpecName "kube-api-access-92vzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.681591 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.681631 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92vzl\" (UniqueName: \"kubernetes.io/projected/7a602ce6-19a8-4a6c-8334-5f805276b554-kube-api-access-92vzl\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.749891 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a602ce6-19a8-4a6c-8334-5f805276b554" (UID: "7a602ce6-19a8-4a6c-8334-5f805276b554"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.783663 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a602ce6-19a8-4a6c-8334-5f805276b554-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.839538 4764 generic.go:334] "Generic (PLEG): container finished" podID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerID="79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe" exitCode=0 Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.839770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerDied","Data":"79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe"} Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.839812 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5x52b" event={"ID":"7a602ce6-19a8-4a6c-8334-5f805276b554","Type":"ContainerDied","Data":"f48512b3770ffb67494a1ec36d294b12cc55e1436f2f9df9f9563189d8f658c8"} Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.839830 4764 scope.go:117] "RemoveContainer" containerID="79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.839849 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5x52b" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.861170 4764 scope.go:117] "RemoveContainer" containerID="83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.887073 4764 scope.go:117] "RemoveContainer" containerID="3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.894077 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5x52b"] Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.916911 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5x52b"] Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.941917 4764 scope.go:117] "RemoveContainer" containerID="79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe" Dec 04 01:48:16 crc kubenswrapper[4764]: E1204 01:48:16.942931 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe\": container with ID starting with 79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe not found: ID does not exist" containerID="79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.943005 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe"} err="failed to get container status \"79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe\": rpc error: code = NotFound desc = could not find container \"79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe\": container with ID starting with 79bfe972d6966f7bede60af933f66fd4257724b4eb38507eef6827c409427dfe not found: ID does 
not exist" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.943031 4764 scope.go:117] "RemoveContainer" containerID="83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313" Dec 04 01:48:16 crc kubenswrapper[4764]: E1204 01:48:16.943446 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313\": container with ID starting with 83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313 not found: ID does not exist" containerID="83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.943474 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313"} err="failed to get container status \"83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313\": rpc error: code = NotFound desc = could not find container \"83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313\": container with ID starting with 83973d0ec72a9d903e3ca69c7d6195068dccf2b919fbba7f255d8c7c726a5313 not found: ID does not exist" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.943501 4764 scope.go:117] "RemoveContainer" containerID="3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001" Dec 04 01:48:16 crc kubenswrapper[4764]: E1204 01:48:16.944009 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001\": container with ID starting with 3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001 not found: ID does not exist" containerID="3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001" Dec 04 01:48:16 crc kubenswrapper[4764]: I1204 01:48:16.944107 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001"} err="failed to get container status \"3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001\": rpc error: code = NotFound desc = could not find container \"3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001\": container with ID starting with 3afdb4e6ccea567fad3bfb65ea35f07701fa8b9d0cce56b971ad8fda47b7a001 not found: ID does not exist" Dec 04 01:48:18 crc kubenswrapper[4764]: I1204 01:48:18.565176 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" path="/var/lib/kubelet/pods/7a602ce6-19a8-4a6c-8334-5f805276b554/volumes" Dec 04 01:48:28 crc kubenswrapper[4764]: I1204 01:48:28.549225 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:48:28 crc kubenswrapper[4764]: E1204 01:48:28.550199 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:48:40 crc kubenswrapper[4764]: I1204 01:48:40.546898 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:48:40 crc kubenswrapper[4764]: E1204 01:48:40.548084 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:48:41 crc kubenswrapper[4764]: I1204 01:48:41.108184 4764 generic.go:334] "Generic (PLEG): container finished" podID="1b4ad484-2065-4ff3-9c95-30391bbec966" containerID="2664d37fe2bc663f66fe7d8ed2f9a28fe599967800c9602eb6c6fce2a0e6cf92" exitCode=0 Dec 04 01:48:41 crc kubenswrapper[4764]: I1204 01:48:41.108239 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" event={"ID":"1b4ad484-2065-4ff3-9c95-30391bbec966","Type":"ContainerDied","Data":"2664d37fe2bc663f66fe7d8ed2f9a28fe599967800c9602eb6c6fce2a0e6cf92"} Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.644817 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.679445 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-inventory\") pod \"1b4ad484-2065-4ff3-9c95-30391bbec966\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.679511 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ceph\") pod \"1b4ad484-2065-4ff3-9c95-30391bbec966\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.679572 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw8z2\" (UniqueName: \"kubernetes.io/projected/1b4ad484-2065-4ff3-9c95-30391bbec966-kube-api-access-zw8z2\") pod \"1b4ad484-2065-4ff3-9c95-30391bbec966\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.679751 4764 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1b4ad484-2065-4ff3-9c95-30391bbec966-ovncontroller-config-0\") pod \"1b4ad484-2065-4ff3-9c95-30391bbec966\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.679818 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ssh-key\") pod \"1b4ad484-2065-4ff3-9c95-30391bbec966\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.679876 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ovn-combined-ca-bundle\") pod \"1b4ad484-2065-4ff3-9c95-30391bbec966\" (UID: \"1b4ad484-2065-4ff3-9c95-30391bbec966\") " Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.685917 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ceph" (OuterVolumeSpecName: "ceph") pod "1b4ad484-2065-4ff3-9c95-30391bbec966" (UID: "1b4ad484-2065-4ff3-9c95-30391bbec966"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.686043 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1b4ad484-2065-4ff3-9c95-30391bbec966" (UID: "1b4ad484-2065-4ff3-9c95-30391bbec966"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.687349 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b4ad484-2065-4ff3-9c95-30391bbec966-kube-api-access-zw8z2" (OuterVolumeSpecName: "kube-api-access-zw8z2") pod "1b4ad484-2065-4ff3-9c95-30391bbec966" (UID: "1b4ad484-2065-4ff3-9c95-30391bbec966"). InnerVolumeSpecName "kube-api-access-zw8z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.717749 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b4ad484-2065-4ff3-9c95-30391bbec966-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "1b4ad484-2065-4ff3-9c95-30391bbec966" (UID: "1b4ad484-2065-4ff3-9c95-30391bbec966"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.729236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b4ad484-2065-4ff3-9c95-30391bbec966" (UID: "1b4ad484-2065-4ff3-9c95-30391bbec966"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.735937 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-inventory" (OuterVolumeSpecName: "inventory") pod "1b4ad484-2065-4ff3-9c95-30391bbec966" (UID: "1b4ad484-2065-4ff3-9c95-30391bbec966"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.781760 4764 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/1b4ad484-2065-4ff3-9c95-30391bbec966-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.781793 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.781802 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.781811 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.781819 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1b4ad484-2065-4ff3-9c95-30391bbec966-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:42 crc kubenswrapper[4764]: I1204 01:48:42.781827 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw8z2\" (UniqueName: \"kubernetes.io/projected/1b4ad484-2065-4ff3-9c95-30391bbec966-kube-api-access-zw8z2\") on node \"crc\" DevicePath \"\"" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.128772 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" event={"ID":"1b4ad484-2065-4ff3-9c95-30391bbec966","Type":"ContainerDied","Data":"8681bb1dd917ccb49447a8be9777634c6d9d1039b3dd3efabbb6d79a2f2537a1"} Dec 04 01:48:43 crc 
kubenswrapper[4764]: I1204 01:48:43.128832 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8681bb1dd917ccb49447a8be9777634c6d9d1039b3dd3efabbb6d79a2f2537a1" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.128898 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-g5wgp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.253208 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-qx7bp"] Dec 04 01:48:43 crc kubenswrapper[4764]: E1204 01:48:43.253781 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="registry-server" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.253803 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="registry-server" Dec 04 01:48:43 crc kubenswrapper[4764]: E1204 01:48:43.253822 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="extract-content" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.253831 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="extract-content" Dec 04 01:48:43 crc kubenswrapper[4764]: E1204 01:48:43.253867 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b4ad484-2065-4ff3-9c95-30391bbec966" containerName="ovn-openstack-openstack-cell1" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.253877 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b4ad484-2065-4ff3-9c95-30391bbec966" containerName="ovn-openstack-openstack-cell1" Dec 04 01:48:43 crc kubenswrapper[4764]: E1204 01:48:43.253898 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" 
containerName="extract-utilities" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.253906 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="extract-utilities" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.254170 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b4ad484-2065-4ff3-9c95-30391bbec966" containerName="ovn-openstack-openstack-cell1" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.254193 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a602ce6-19a8-4a6c-8334-5f805276b554" containerName="registry-server" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.255117 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.260261 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.260480 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.260764 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.261066 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.261362 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.261591 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.273512 4764 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-qx7bp"] Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.295520 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.295707 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvz9b\" (UniqueName: \"kubernetes.io/projected/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-kube-api-access-rvz9b\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.295909 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.296305 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 
crc kubenswrapper[4764]: I1204 01:48:43.296638 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.296732 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.296772 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.398526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.398904 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.398994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.399019 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.399046 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.399117 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.399155 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvz9b\" (UniqueName: \"kubernetes.io/projected/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-kube-api-access-rvz9b\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.402859 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.402876 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.402870 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.403537 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.407930 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.409067 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.418339 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvz9b\" (UniqueName: \"kubernetes.io/projected/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-kube-api-access-rvz9b\") pod \"neutron-metadata-openstack-openstack-cell1-qx7bp\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:43 crc kubenswrapper[4764]: I1204 01:48:43.601483 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:48:44 crc kubenswrapper[4764]: W1204 01:48:44.188891 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b5c139_4e32_4f95_b7a9_9e33f4d5a1d3.slice/crio-4e97e7f4ca9de1e5d7586b25ce2fce05029b93f5bff9eba6b0100b3748708b6e WatchSource:0}: Error finding container 4e97e7f4ca9de1e5d7586b25ce2fce05029b93f5bff9eba6b0100b3748708b6e: Status 404 returned error can't find the container with id 4e97e7f4ca9de1e5d7586b25ce2fce05029b93f5bff9eba6b0100b3748708b6e Dec 04 01:48:44 crc kubenswrapper[4764]: I1204 01:48:44.192351 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:48:44 crc kubenswrapper[4764]: I1204 01:48:44.211037 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-qx7bp"] Dec 04 01:48:45 crc kubenswrapper[4764]: I1204 01:48:45.147921 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" event={"ID":"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3","Type":"ContainerStarted","Data":"58ce47b70de70205c30cacfdc4c3c8dc597db83a5d1bc605a98d63c9a2c78111"} Dec 04 01:48:45 crc kubenswrapper[4764]: I1204 01:48:45.147963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" event={"ID":"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3","Type":"ContainerStarted","Data":"4e97e7f4ca9de1e5d7586b25ce2fce05029b93f5bff9eba6b0100b3748708b6e"} Dec 04 01:48:45 crc kubenswrapper[4764]: I1204 01:48:45.189312 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" podStartSLOduration=1.675231074 podStartE2EDuration="2.189274833s" podCreationTimestamp="2025-12-04 01:48:43 +0000 UTC" 
firstStartedPulling="2025-12-04 01:48:44.192103713 +0000 UTC m=+7659.953428124" lastFinishedPulling="2025-12-04 01:48:44.706147452 +0000 UTC m=+7660.467471883" observedRunningTime="2025-12-04 01:48:45.164872094 +0000 UTC m=+7660.926196515" watchObservedRunningTime="2025-12-04 01:48:45.189274833 +0000 UTC m=+7660.950599284" Dec 04 01:48:52 crc kubenswrapper[4764]: I1204 01:48:52.545775 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:48:52 crc kubenswrapper[4764]: E1204 01:48:52.547410 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:49:03 crc kubenswrapper[4764]: I1204 01:49:03.546405 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:49:03 crc kubenswrapper[4764]: E1204 01:49:03.547625 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:49:14 crc kubenswrapper[4764]: I1204 01:49:14.556762 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:49:14 crc kubenswrapper[4764]: E1204 01:49:14.557485 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:49:28 crc kubenswrapper[4764]: I1204 01:49:28.546346 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:49:28 crc kubenswrapper[4764]: E1204 01:49:28.547048 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:49:40 crc kubenswrapper[4764]: I1204 01:49:40.783969 4764 generic.go:334] "Generic (PLEG): container finished" podID="e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" containerID="58ce47b70de70205c30cacfdc4c3c8dc597db83a5d1bc605a98d63c9a2c78111" exitCode=0 Dec 04 01:49:40 crc kubenswrapper[4764]: I1204 01:49:40.784088 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" event={"ID":"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3","Type":"ContainerDied","Data":"58ce47b70de70205c30cacfdc4c3c8dc597db83a5d1bc605a98d63c9a2c78111"} Dec 04 01:49:41 crc kubenswrapper[4764]: I1204 01:49:41.546784 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:49:41 crc kubenswrapper[4764]: E1204 01:49:41.548016 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.399567 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.543534 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-ovn-metadata-agent-neutron-config-0\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.543749 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ssh-key\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.543795 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-inventory\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.543866 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-nova-metadata-neutron-config-0\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 
01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.543941 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-metadata-combined-ca-bundle\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.544085 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvz9b\" (UniqueName: \"kubernetes.io/projected/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-kube-api-access-rvz9b\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.544181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ceph\") pod \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\" (UID: \"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3\") " Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.549330 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-kube-api-access-rvz9b" (OuterVolumeSpecName: "kube-api-access-rvz9b") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "kube-api-access-rvz9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.549489 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.559945 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ceph" (OuterVolumeSpecName: "ceph") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.573655 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.574835 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.577014 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-inventory" (OuterVolumeSpecName: "inventory") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.582939 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" (UID: "e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647795 4764 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647830 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647845 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvz9b\" (UniqueName: \"kubernetes.io/projected/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-kube-api-access-rvz9b\") on node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647859 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647871 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-neutron-ovn-metadata-agent-neutron-config-0\") on 
node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647885 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.647898 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.815836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" event={"ID":"e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3","Type":"ContainerDied","Data":"4e97e7f4ca9de1e5d7586b25ce2fce05029b93f5bff9eba6b0100b3748708b6e"} Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.815884 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e97e7f4ca9de1e5d7586b25ce2fce05029b93f5bff9eba6b0100b3748708b6e" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.815920 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-qx7bp" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.923813 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9fc77"] Dec 04 01:49:42 crc kubenswrapper[4764]: E1204 01:49:42.925220 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" containerName="neutron-metadata-openstack-openstack-cell1" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.925247 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" containerName="neutron-metadata-openstack-openstack-cell1" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.925524 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3" containerName="neutron-metadata-openstack-openstack-cell1" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.926323 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.930357 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.930673 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.930902 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.930968 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.931332 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:49:42 crc kubenswrapper[4764]: I1204 01:49:42.937116 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9fc77"] Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.056583 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.056882 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ceph\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc 
kubenswrapper[4764]: I1204 01:49:43.056946 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ssh-key\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.057103 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.057181 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-inventory\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.057252 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwn92\" (UniqueName: \"kubernetes.io/projected/e419a2a6-f35a-4620-9014-eaefccaf150e-kube-api-access-wwn92\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.159000 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-combined-ca-bundle\") pod 
\"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.159164 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ceph\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.159195 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ssh-key\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.159271 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.159335 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-inventory\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.159401 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwn92\" (UniqueName: 
\"kubernetes.io/projected/e419a2a6-f35a-4620-9014-eaefccaf150e-kube-api-access-wwn92\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.164602 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ceph\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.164754 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.167361 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.168097 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ssh-key\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.168257 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-inventory\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.177576 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwn92\" (UniqueName: \"kubernetes.io/projected/e419a2a6-f35a-4620-9014-eaefccaf150e-kube-api-access-wwn92\") pod \"libvirt-openstack-openstack-cell1-9fc77\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.247661 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:49:43 crc kubenswrapper[4764]: I1204 01:49:43.940976 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-9fc77"] Dec 04 01:49:44 crc kubenswrapper[4764]: I1204 01:49:44.841882 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" event={"ID":"e419a2a6-f35a-4620-9014-eaefccaf150e","Type":"ContainerStarted","Data":"24c822293391e0108e65dd90621e7bcf4cff4b7fa20939654e72d2e006098a77"} Dec 04 01:49:44 crc kubenswrapper[4764]: I1204 01:49:44.842324 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" event={"ID":"e419a2a6-f35a-4620-9014-eaefccaf150e","Type":"ContainerStarted","Data":"77b60d3ec768583fc78e0efa04fe17923e549993680a326fff848caf95946738"} Dec 04 01:49:44 crc kubenswrapper[4764]: I1204 01:49:44.869165 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" podStartSLOduration=2.460402484 podStartE2EDuration="2.869150918s" podCreationTimestamp="2025-12-04 01:49:42 +0000 UTC" 
firstStartedPulling="2025-12-04 01:49:43.943359902 +0000 UTC m=+7719.704684323" lastFinishedPulling="2025-12-04 01:49:44.352108306 +0000 UTC m=+7720.113432757" observedRunningTime="2025-12-04 01:49:44.86352937 +0000 UTC m=+7720.624853781" watchObservedRunningTime="2025-12-04 01:49:44.869150918 +0000 UTC m=+7720.630475329" Dec 04 01:49:54 crc kubenswrapper[4764]: I1204 01:49:54.737147 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:49:54 crc kubenswrapper[4764]: E1204 01:49:54.745313 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:50:06 crc kubenswrapper[4764]: I1204 01:50:06.545874 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:50:06 crc kubenswrapper[4764]: E1204 01:50:06.546868 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.512167 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9b6s2"] Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.517945 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.523601 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9b6s2"] Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.658989 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-utilities\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.659122 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-catalog-content\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.659233 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4jn6\" (UniqueName: \"kubernetes.io/projected/ab56b42a-3541-4771-bf8b-cb1af0dcb015-kube-api-access-v4jn6\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.762917 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-utilities\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.763042 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-catalog-content\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.763123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jn6\" (UniqueName: \"kubernetes.io/projected/ab56b42a-3541-4771-bf8b-cb1af0dcb015-kube-api-access-v4jn6\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.763907 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-utilities\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.764307 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-catalog-content\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.787047 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jn6\" (UniqueName: \"kubernetes.io/projected/ab56b42a-3541-4771-bf8b-cb1af0dcb015-kube-api-access-v4jn6\") pod \"certified-operators-9b6s2\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:12 crc kubenswrapper[4764]: I1204 01:50:12.864615 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:13 crc kubenswrapper[4764]: I1204 01:50:13.423678 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9b6s2"] Dec 04 01:50:13 crc kubenswrapper[4764]: W1204 01:50:13.429291 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab56b42a_3541_4771_bf8b_cb1af0dcb015.slice/crio-d0cacb953b2233cad09a17cfd22d6e737383488f35f6a764c69a3db280ba41fe WatchSource:0}: Error finding container d0cacb953b2233cad09a17cfd22d6e737383488f35f6a764c69a3db280ba41fe: Status 404 returned error can't find the container with id d0cacb953b2233cad09a17cfd22d6e737383488f35f6a764c69a3db280ba41fe Dec 04 01:50:14 crc kubenswrapper[4764]: I1204 01:50:14.010042 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerID="17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2" exitCode=0 Dec 04 01:50:14 crc kubenswrapper[4764]: I1204 01:50:14.010146 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9b6s2" event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerDied","Data":"17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2"} Dec 04 01:50:14 crc kubenswrapper[4764]: I1204 01:50:14.010378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9b6s2" event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerStarted","Data":"d0cacb953b2233cad09a17cfd22d6e737383488f35f6a764c69a3db280ba41fe"} Dec 04 01:50:15 crc kubenswrapper[4764]: I1204 01:50:15.029323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9b6s2" 
event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerStarted","Data":"3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9"} Dec 04 01:50:16 crc kubenswrapper[4764]: I1204 01:50:16.045223 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerID="3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9" exitCode=0 Dec 04 01:50:16 crc kubenswrapper[4764]: I1204 01:50:16.045285 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9b6s2" event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerDied","Data":"3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9"} Dec 04 01:50:17 crc kubenswrapper[4764]: I1204 01:50:17.094785 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9b6s2" event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerStarted","Data":"442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688"} Dec 04 01:50:20 crc kubenswrapper[4764]: I1204 01:50:20.546648 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:50:20 crc kubenswrapper[4764]: E1204 01:50:20.547855 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:50:22 crc kubenswrapper[4764]: I1204 01:50:22.865482 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:22 crc kubenswrapper[4764]: I1204 01:50:22.865913 4764 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:22 crc kubenswrapper[4764]: I1204 01:50:22.945500 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:22 crc kubenswrapper[4764]: I1204 01:50:22.980847 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9b6s2" podStartSLOduration=8.543337701 podStartE2EDuration="10.980827018s" podCreationTimestamp="2025-12-04 01:50:12 +0000 UTC" firstStartedPulling="2025-12-04 01:50:14.013262416 +0000 UTC m=+7749.774586827" lastFinishedPulling="2025-12-04 01:50:16.450751733 +0000 UTC m=+7752.212076144" observedRunningTime="2025-12-04 01:50:17.113996874 +0000 UTC m=+7752.875321285" watchObservedRunningTime="2025-12-04 01:50:22.980827018 +0000 UTC m=+7758.742151429" Dec 04 01:50:23 crc kubenswrapper[4764]: I1204 01:50:23.236073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:23 crc kubenswrapper[4764]: I1204 01:50:23.306365 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9b6s2"] Dec 04 01:50:25 crc kubenswrapper[4764]: I1204 01:50:25.207181 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9b6s2" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="registry-server" containerID="cri-o://442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688" gracePeriod=2 Dec 04 01:50:25 crc kubenswrapper[4764]: I1204 01:50:25.885305 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.000116 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-catalog-content\") pod \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.000166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4jn6\" (UniqueName: \"kubernetes.io/projected/ab56b42a-3541-4771-bf8b-cb1af0dcb015-kube-api-access-v4jn6\") pod \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.000286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-utilities\") pod \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\" (UID: \"ab56b42a-3541-4771-bf8b-cb1af0dcb015\") " Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.002194 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-utilities" (OuterVolumeSpecName: "utilities") pod "ab56b42a-3541-4771-bf8b-cb1af0dcb015" (UID: "ab56b42a-3541-4771-bf8b-cb1af0dcb015"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.006236 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab56b42a-3541-4771-bf8b-cb1af0dcb015-kube-api-access-v4jn6" (OuterVolumeSpecName: "kube-api-access-v4jn6") pod "ab56b42a-3541-4771-bf8b-cb1af0dcb015" (UID: "ab56b42a-3541-4771-bf8b-cb1af0dcb015"). InnerVolumeSpecName "kube-api-access-v4jn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.065321 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab56b42a-3541-4771-bf8b-cb1af0dcb015" (UID: "ab56b42a-3541-4771-bf8b-cb1af0dcb015"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.103808 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.103895 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4jn6\" (UniqueName: \"kubernetes.io/projected/ab56b42a-3541-4771-bf8b-cb1af0dcb015-kube-api-access-v4jn6\") on node \"crc\" DevicePath \"\"" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.104168 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab56b42a-3541-4771-bf8b-cb1af0dcb015-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.221813 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerID="442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688" exitCode=0 Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.221865 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9b6s2" event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerDied","Data":"442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688"} Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.221900 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9b6s2" event={"ID":"ab56b42a-3541-4771-bf8b-cb1af0dcb015","Type":"ContainerDied","Data":"d0cacb953b2233cad09a17cfd22d6e737383488f35f6a764c69a3db280ba41fe"} Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.221932 4764 scope.go:117] "RemoveContainer" containerID="442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.222860 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9b6s2" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.269559 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9b6s2"] Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.269941 4764 scope.go:117] "RemoveContainer" containerID="3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.282003 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9b6s2"] Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.300361 4764 scope.go:117] "RemoveContainer" containerID="17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.346122 4764 scope.go:117] "RemoveContainer" containerID="442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688" Dec 04 01:50:26 crc kubenswrapper[4764]: E1204 01:50:26.347042 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688\": container with ID starting with 442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688 not found: ID does not exist" containerID="442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 
01:50:26.347094 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688"} err="failed to get container status \"442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688\": rpc error: code = NotFound desc = could not find container \"442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688\": container with ID starting with 442a3bc50e089d9baceecf2c6e221b255e3a5ae76a9bb64a276943f57f8f8688 not found: ID does not exist" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.347127 4764 scope.go:117] "RemoveContainer" containerID="3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9" Dec 04 01:50:26 crc kubenswrapper[4764]: E1204 01:50:26.347700 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9\": container with ID starting with 3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9 not found: ID does not exist" containerID="3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.347799 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9"} err="failed to get container status \"3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9\": rpc error: code = NotFound desc = could not find container \"3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9\": container with ID starting with 3f6d0f4105191944ae4211be62c57eff622035cea6e8224c79c0034c7a75a3d9 not found: ID does not exist" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.347832 4764 scope.go:117] "RemoveContainer" containerID="17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2" Dec 04 01:50:26 crc 
kubenswrapper[4764]: E1204 01:50:26.349957 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2\": container with ID starting with 17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2 not found: ID does not exist" containerID="17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.350000 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2"} err="failed to get container status \"17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2\": rpc error: code = NotFound desc = could not find container \"17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2\": container with ID starting with 17eb3770dcaf61983a72fb9b7594fa81e6e3b00376b0be407cd36652c0aa97d2 not found: ID does not exist" Dec 04 01:50:26 crc kubenswrapper[4764]: I1204 01:50:26.560934 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" path="/var/lib/kubelet/pods/ab56b42a-3541-4771-bf8b-cb1af0dcb015/volumes" Dec 04 01:50:35 crc kubenswrapper[4764]: I1204 01:50:35.546641 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:50:36 crc kubenswrapper[4764]: I1204 01:50:36.360228 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"72961a01b8dd1221f6447e3f2e3a0f15bcf0432f34dd56f1d75da7ab07484f0a"} Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.286325 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fbwzp"] Dec 04 01:50:58 crc 
kubenswrapper[4764]: E1204 01:50:58.287424 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="extract-content" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.287443 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="extract-content" Dec 04 01:50:58 crc kubenswrapper[4764]: E1204 01:50:58.287488 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="registry-server" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.287497 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="registry-server" Dec 04 01:50:58 crc kubenswrapper[4764]: E1204 01:50:58.287548 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="extract-utilities" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.287559 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="extract-utilities" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.288149 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab56b42a-3541-4771-bf8b-cb1af0dcb015" containerName="registry-server" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.290193 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.312148 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbwzp"] Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.322045 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwlgt\" (UniqueName: \"kubernetes.io/projected/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-kube-api-access-nwlgt\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.324148 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-utilities\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.324223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-catalog-content\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.426116 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwlgt\" (UniqueName: \"kubernetes.io/projected/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-kube-api-access-nwlgt\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.426485 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-utilities\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.426526 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-catalog-content\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.427151 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-utilities\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.427166 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-catalog-content\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.454577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwlgt\" (UniqueName: \"kubernetes.io/projected/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-kube-api-access-nwlgt\") pod \"community-operators-fbwzp\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:58 crc kubenswrapper[4764]: I1204 01:50:58.630284 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:50:59 crc kubenswrapper[4764]: I1204 01:50:59.198251 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fbwzp"] Dec 04 01:51:00 crc kubenswrapper[4764]: I1204 01:51:00.007155 4764 generic.go:334] "Generic (PLEG): container finished" podID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerID="e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504" exitCode=0 Dec 04 01:51:00 crc kubenswrapper[4764]: I1204 01:51:00.007236 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerDied","Data":"e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504"} Dec 04 01:51:00 crc kubenswrapper[4764]: I1204 01:51:00.007825 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerStarted","Data":"f97ef7b66453bb8da649e0137b4d8be4a0eb712aee840dfad4fd2f98e3af1cd4"} Dec 04 01:51:01 crc kubenswrapper[4764]: I1204 01:51:01.024795 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerStarted","Data":"18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2"} Dec 04 01:51:02 crc kubenswrapper[4764]: I1204 01:51:02.041440 4764 generic.go:334] "Generic (PLEG): container finished" podID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerID="18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2" exitCode=0 Dec 04 01:51:02 crc kubenswrapper[4764]: I1204 01:51:02.041483 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" 
event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerDied","Data":"18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2"} Dec 04 01:51:03 crc kubenswrapper[4764]: I1204 01:51:03.052213 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerStarted","Data":"7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c"} Dec 04 01:51:03 crc kubenswrapper[4764]: I1204 01:51:03.074587 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fbwzp" podStartSLOduration=2.649149761 podStartE2EDuration="5.074571333s" podCreationTimestamp="2025-12-04 01:50:58 +0000 UTC" firstStartedPulling="2025-12-04 01:51:00.010131634 +0000 UTC m=+7795.771456055" lastFinishedPulling="2025-12-04 01:51:02.435553176 +0000 UTC m=+7798.196877627" observedRunningTime="2025-12-04 01:51:03.069611721 +0000 UTC m=+7798.830936132" watchObservedRunningTime="2025-12-04 01:51:03.074571333 +0000 UTC m=+7798.835895744" Dec 04 01:51:08 crc kubenswrapper[4764]: I1204 01:51:08.630439 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:51:08 crc kubenswrapper[4764]: I1204 01:51:08.631036 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:51:08 crc kubenswrapper[4764]: I1204 01:51:08.724379 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:51:09 crc kubenswrapper[4764]: I1204 01:51:09.222702 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:51:09 crc kubenswrapper[4764]: I1204 01:51:09.282828 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-fbwzp"] Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.169890 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fbwzp" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="registry-server" containerID="cri-o://7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c" gracePeriod=2 Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.709997 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.846462 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-utilities\") pod \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.846833 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-catalog-content\") pod \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.846911 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwlgt\" (UniqueName: \"kubernetes.io/projected/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-kube-api-access-nwlgt\") pod \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\" (UID: \"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a\") " Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.847498 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-utilities" (OuterVolumeSpecName: "utilities") pod "50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" (UID: 
"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.852025 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-kube-api-access-nwlgt" (OuterVolumeSpecName: "kube-api-access-nwlgt") pod "50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" (UID: "50a8f0f4-45f3-4fc2-a99f-f215f136ad2a"). InnerVolumeSpecName "kube-api-access-nwlgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.917231 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" (UID: "50a8f0f4-45f3-4fc2-a99f-f215f136ad2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.949075 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.949106 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwlgt\" (UniqueName: \"kubernetes.io/projected/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-kube-api-access-nwlgt\") on node \"crc\" DevicePath \"\"" Dec 04 01:51:11 crc kubenswrapper[4764]: I1204 01:51:11.949116 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.181499 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerID="7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c" exitCode=0 Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.181548 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerDied","Data":"7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c"} Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.181585 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fbwzp" event={"ID":"50a8f0f4-45f3-4fc2-a99f-f215f136ad2a","Type":"ContainerDied","Data":"f97ef7b66453bb8da649e0137b4d8be4a0eb712aee840dfad4fd2f98e3af1cd4"} Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.181605 4764 scope.go:117] "RemoveContainer" containerID="7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.181608 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fbwzp" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.215847 4764 scope.go:117] "RemoveContainer" containerID="18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.243093 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fbwzp"] Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.259159 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fbwzp"] Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.267404 4764 scope.go:117] "RemoveContainer" containerID="e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.332820 4764 scope.go:117] "RemoveContainer" containerID="7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c" Dec 04 01:51:12 crc kubenswrapper[4764]: E1204 01:51:12.333316 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c\": container with ID starting with 7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c not found: ID does not exist" containerID="7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.333354 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c"} err="failed to get container status \"7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c\": rpc error: code = NotFound desc = could not find container \"7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c\": container with ID starting with 7741478ebe6c7139a731cde46aceb888942fb6b1c9b7d1ef74a2fe239a2c1d4c not 
found: ID does not exist" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.333379 4764 scope.go:117] "RemoveContainer" containerID="18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2" Dec 04 01:51:12 crc kubenswrapper[4764]: E1204 01:51:12.340086 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2\": container with ID starting with 18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2 not found: ID does not exist" containerID="18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.340114 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2"} err="failed to get container status \"18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2\": rpc error: code = NotFound desc = could not find container \"18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2\": container with ID starting with 18a7af5969938057fd2432fdbc7a80531ea21f517ca6ad84a0cf746181db24e2 not found: ID does not exist" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.340132 4764 scope.go:117] "RemoveContainer" containerID="e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504" Dec 04 01:51:12 crc kubenswrapper[4764]: E1204 01:51:12.340735 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504\": container with ID starting with e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504 not found: ID does not exist" containerID="e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.340780 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504"} err="failed to get container status \"e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504\": rpc error: code = NotFound desc = could not find container \"e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504\": container with ID starting with e6e9bb83bb7ceda538ed3ac2ace2fbb157586206f2233467a41f33cf5abaf504 not found: ID does not exist" Dec 04 01:51:12 crc kubenswrapper[4764]: I1204 01:51:12.560460 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" path="/var/lib/kubelet/pods/50a8f0f4-45f3-4fc2-a99f-f215f136ad2a/volumes" Dec 04 01:52:50 crc kubenswrapper[4764]: I1204 01:52:50.868875 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:52:50 crc kubenswrapper[4764]: I1204 01:52:50.870051 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:53:20 crc kubenswrapper[4764]: I1204 01:53:20.868523 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:53:20 crc kubenswrapper[4764]: I1204 01:53:20.869146 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:53:50 crc kubenswrapper[4764]: I1204 01:53:50.868821 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:53:50 crc kubenswrapper[4764]: I1204 01:53:50.869602 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:53:50 crc kubenswrapper[4764]: I1204 01:53:50.869698 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:53:50 crc kubenswrapper[4764]: I1204 01:53:50.871078 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"72961a01b8dd1221f6447e3f2e3a0f15bcf0432f34dd56f1d75da7ab07484f0a"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:53:50 crc kubenswrapper[4764]: I1204 01:53:50.871196 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" 
containerID="cri-o://72961a01b8dd1221f6447e3f2e3a0f15bcf0432f34dd56f1d75da7ab07484f0a" gracePeriod=600 Dec 04 01:53:51 crc kubenswrapper[4764]: I1204 01:53:51.203061 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="72961a01b8dd1221f6447e3f2e3a0f15bcf0432f34dd56f1d75da7ab07484f0a" exitCode=0 Dec 04 01:53:51 crc kubenswrapper[4764]: I1204 01:53:51.203119 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"72961a01b8dd1221f6447e3f2e3a0f15bcf0432f34dd56f1d75da7ab07484f0a"} Dec 04 01:53:51 crc kubenswrapper[4764]: I1204 01:53:51.203381 4764 scope.go:117] "RemoveContainer" containerID="8bfe0f61f26581f9602395e34f38303ac2071e63a66a39299bb5c2335172f73c" Dec 04 01:53:52 crc kubenswrapper[4764]: I1204 01:53:52.217963 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da"} Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.123192 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5fmm8"] Dec 04 01:54:00 crc kubenswrapper[4764]: E1204 01:54:00.124245 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="registry-server" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.124262 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="registry-server" Dec 04 01:54:00 crc kubenswrapper[4764]: E1204 01:54:00.124276 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="extract-utilities" Dec 04 01:54:00 
crc kubenswrapper[4764]: I1204 01:54:00.124283 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="extract-utilities" Dec 04 01:54:00 crc kubenswrapper[4764]: E1204 01:54:00.124303 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="extract-content" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.124311 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="extract-content" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.124622 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a8f0f4-45f3-4fc2-a99f-f215f136ad2a" containerName="registry-server" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.127026 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.137617 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fmm8"] Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.197697 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-catalog-content\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.198146 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-utilities\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: 
I1204 01:54:00.198217 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjtj8\" (UniqueName: \"kubernetes.io/projected/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-kube-api-access-jjtj8\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.300201 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-utilities\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.300284 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjtj8\" (UniqueName: \"kubernetes.io/projected/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-kube-api-access-jjtj8\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.300378 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-catalog-content\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.300891 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-catalog-content\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: 
I1204 01:54:00.301113 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-utilities\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.329565 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjtj8\" (UniqueName: \"kubernetes.io/projected/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-kube-api-access-jjtj8\") pod \"redhat-marketplace-5fmm8\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.457785 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:00 crc kubenswrapper[4764]: I1204 01:54:00.968150 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fmm8"] Dec 04 01:54:01 crc kubenswrapper[4764]: I1204 01:54:01.331753 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerID="f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7" exitCode=0 Dec 04 01:54:01 crc kubenswrapper[4764]: I1204 01:54:01.331962 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fmm8" event={"ID":"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2","Type":"ContainerDied","Data":"f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7"} Dec 04 01:54:01 crc kubenswrapper[4764]: I1204 01:54:01.332073 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fmm8" event={"ID":"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2","Type":"ContainerStarted","Data":"a4c02bdd54daf367edf4961e4989d3370d746e5f6ecb5b948a69e6fcc8f0a6eb"} Dec 
04 01:54:01 crc kubenswrapper[4764]: I1204 01:54:01.335332 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:54:03 crc kubenswrapper[4764]: I1204 01:54:03.358202 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerID="3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767" exitCode=0 Dec 04 01:54:03 crc kubenswrapper[4764]: I1204 01:54:03.358303 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fmm8" event={"ID":"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2","Type":"ContainerDied","Data":"3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767"} Dec 04 01:54:04 crc kubenswrapper[4764]: I1204 01:54:04.375773 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fmm8" event={"ID":"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2","Type":"ContainerStarted","Data":"a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65"} Dec 04 01:54:04 crc kubenswrapper[4764]: I1204 01:54:04.415340 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5fmm8" podStartSLOduration=1.687638453 podStartE2EDuration="4.415313813s" podCreationTimestamp="2025-12-04 01:54:00 +0000 UTC" firstStartedPulling="2025-12-04 01:54:01.334922224 +0000 UTC m=+7977.096246635" lastFinishedPulling="2025-12-04 01:54:04.062597584 +0000 UTC m=+7979.823921995" observedRunningTime="2025-12-04 01:54:04.403141114 +0000 UTC m=+7980.164465535" watchObservedRunningTime="2025-12-04 01:54:04.415313813 +0000 UTC m=+7980.176638234" Dec 04 01:54:10 crc kubenswrapper[4764]: I1204 01:54:10.458352 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:10 crc kubenswrapper[4764]: I1204 01:54:10.459078 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:10 crc kubenswrapper[4764]: I1204 01:54:10.519648 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:11 crc kubenswrapper[4764]: I1204 01:54:11.541540 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:11 crc kubenswrapper[4764]: I1204 01:54:11.592195 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fmm8"] Dec 04 01:54:13 crc kubenswrapper[4764]: I1204 01:54:13.503563 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5fmm8" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="registry-server" containerID="cri-o://a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65" gracePeriod=2 Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.087532 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.140697 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-catalog-content\") pod \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.140905 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-utilities\") pod \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.141010 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjtj8\" (UniqueName: \"kubernetes.io/projected/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-kube-api-access-jjtj8\") pod \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\" (UID: \"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2\") " Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.141999 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-utilities" (OuterVolumeSpecName: "utilities") pod "6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" (UID: "6ef1bbb1-6805-48d2-9c82-f30dc7e058d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.175976 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-kube-api-access-jjtj8" (OuterVolumeSpecName: "kube-api-access-jjtj8") pod "6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" (UID: "6ef1bbb1-6805-48d2-9c82-f30dc7e058d2"). InnerVolumeSpecName "kube-api-access-jjtj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.182514 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" (UID: "6ef1bbb1-6805-48d2-9c82-f30dc7e058d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.242922 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.242954 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.242965 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjtj8\" (UniqueName: \"kubernetes.io/projected/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2-kube-api-access-jjtj8\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.520982 4764 generic.go:334] "Generic (PLEG): container finished" podID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerID="a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65" exitCode=0 Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.521109 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5fmm8" event={"ID":"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2","Type":"ContainerDied","Data":"a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65"} Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.521378 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5fmm8" event={"ID":"6ef1bbb1-6805-48d2-9c82-f30dc7e058d2","Type":"ContainerDied","Data":"a4c02bdd54daf367edf4961e4989d3370d746e5f6ecb5b948a69e6fcc8f0a6eb"} Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.521414 4764 scope.go:117] "RemoveContainer" containerID="a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.521137 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5fmm8" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.562446 4764 scope.go:117] "RemoveContainer" containerID="3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.577275 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fmm8"] Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.586638 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5fmm8"] Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.609606 4764 scope.go:117] "RemoveContainer" containerID="f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.642012 4764 scope.go:117] "RemoveContainer" containerID="a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65" Dec 04 01:54:14 crc kubenswrapper[4764]: E1204 01:54:14.642673 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65\": container with ID starting with a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65 not found: ID does not exist" containerID="a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.642773 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65"} err="failed to get container status \"a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65\": rpc error: code = NotFound desc = could not find container \"a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65\": container with ID starting with a3aae0c9825288458f9afb9bc3aaeddbbcf93e16579e88bc4cee074e16035a65 not found: ID does not exist" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.642822 4764 scope.go:117] "RemoveContainer" containerID="3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767" Dec 04 01:54:14 crc kubenswrapper[4764]: E1204 01:54:14.643165 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767\": container with ID starting with 3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767 not found: ID does not exist" containerID="3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.643198 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767"} err="failed to get container status \"3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767\": rpc error: code = NotFound desc = could not find container \"3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767\": container with ID starting with 3eab528990a9ee3e4f32e72b12669698ccc2fc25e8d2f1c8eb49508b64b97767 not found: ID does not exist" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.643220 4764 scope.go:117] "RemoveContainer" containerID="f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7" Dec 04 01:54:14 crc kubenswrapper[4764]: E1204 
01:54:14.643484 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7\": container with ID starting with f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7 not found: ID does not exist" containerID="f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7" Dec 04 01:54:14 crc kubenswrapper[4764]: I1204 01:54:14.643572 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7"} err="failed to get container status \"f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7\": rpc error: code = NotFound desc = could not find container \"f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7\": container with ID starting with f26b65d461cd2650cd037a0afd70d9c1c4c519c1d8e3cfc20d80203fb8a249c7 not found: ID does not exist" Dec 04 01:54:16 crc kubenswrapper[4764]: I1204 01:54:16.562369 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" path="/var/lib/kubelet/pods/6ef1bbb1-6805-48d2-9c82-f30dc7e058d2/volumes" Dec 04 01:54:30 crc kubenswrapper[4764]: I1204 01:54:30.708936 4764 generic.go:334] "Generic (PLEG): container finished" podID="e419a2a6-f35a-4620-9014-eaefccaf150e" containerID="24c822293391e0108e65dd90621e7bcf4cff4b7fa20939654e72d2e006098a77" exitCode=0 Dec 04 01:54:30 crc kubenswrapper[4764]: I1204 01:54:30.709034 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" event={"ID":"e419a2a6-f35a-4620-9014-eaefccaf150e","Type":"ContainerDied","Data":"24c822293391e0108e65dd90621e7bcf4cff4b7fa20939654e72d2e006098a77"} Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.259524 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.389132 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ceph\") pod \"e419a2a6-f35a-4620-9014-eaefccaf150e\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.389207 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-combined-ca-bundle\") pod \"e419a2a6-f35a-4620-9014-eaefccaf150e\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.389291 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-secret-0\") pod \"e419a2a6-f35a-4620-9014-eaefccaf150e\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.389462 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwn92\" (UniqueName: \"kubernetes.io/projected/e419a2a6-f35a-4620-9014-eaefccaf150e-kube-api-access-wwn92\") pod \"e419a2a6-f35a-4620-9014-eaefccaf150e\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.389597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ssh-key\") pod \"e419a2a6-f35a-4620-9014-eaefccaf150e\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.389670 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-inventory\") pod \"e419a2a6-f35a-4620-9014-eaefccaf150e\" (UID: \"e419a2a6-f35a-4620-9014-eaefccaf150e\") " Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.410618 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e419a2a6-f35a-4620-9014-eaefccaf150e-kube-api-access-wwn92" (OuterVolumeSpecName: "kube-api-access-wwn92") pod "e419a2a6-f35a-4620-9014-eaefccaf150e" (UID: "e419a2a6-f35a-4620-9014-eaefccaf150e"). InnerVolumeSpecName "kube-api-access-wwn92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.422062 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e419a2a6-f35a-4620-9014-eaefccaf150e" (UID: "e419a2a6-f35a-4620-9014-eaefccaf150e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.424959 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ceph" (OuterVolumeSpecName: "ceph") pod "e419a2a6-f35a-4620-9014-eaefccaf150e" (UID: "e419a2a6-f35a-4620-9014-eaefccaf150e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.425298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e419a2a6-f35a-4620-9014-eaefccaf150e" (UID: "e419a2a6-f35a-4620-9014-eaefccaf150e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.425894 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-inventory" (OuterVolumeSpecName: "inventory") pod "e419a2a6-f35a-4620-9014-eaefccaf150e" (UID: "e419a2a6-f35a-4620-9014-eaefccaf150e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.451492 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e419a2a6-f35a-4620-9014-eaefccaf150e" (UID: "e419a2a6-f35a-4620-9014-eaefccaf150e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.493140 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.493184 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.493199 4764 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.493211 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwn92\" (UniqueName: \"kubernetes.io/projected/e419a2a6-f35a-4620-9014-eaefccaf150e-kube-api-access-wwn92\") on node \"crc\" DevicePath 
\"\"" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.493223 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.493233 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e419a2a6-f35a-4620-9014-eaefccaf150e-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.735273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" event={"ID":"e419a2a6-f35a-4620-9014-eaefccaf150e","Type":"ContainerDied","Data":"77b60d3ec768583fc78e0efa04fe17923e549993680a326fff848caf95946738"} Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.735337 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77b60d3ec768583fc78e0efa04fe17923e549993680a326fff848caf95946738" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.735424 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-9fc77" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839060 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-rbc64"] Dec 04 01:54:32 crc kubenswrapper[4764]: E1204 01:54:32.839447 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e419a2a6-f35a-4620-9014-eaefccaf150e" containerName="libvirt-openstack-openstack-cell1" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839461 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e419a2a6-f35a-4620-9014-eaefccaf150e" containerName="libvirt-openstack-openstack-cell1" Dec 04 01:54:32 crc kubenswrapper[4764]: E1204 01:54:32.839510 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="extract-content" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839531 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="extract-content" Dec 04 01:54:32 crc kubenswrapper[4764]: E1204 01:54:32.839538 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="registry-server" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839545 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="registry-server" Dec 04 01:54:32 crc kubenswrapper[4764]: E1204 01:54:32.839558 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="extract-utilities" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839565 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="extract-utilities" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839771 4764 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="6ef1bbb1-6805-48d2-9c82-f30dc7e058d2" containerName="registry-server" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.839792 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e419a2a6-f35a-4620-9014-eaefccaf150e" containerName="libvirt-openstack-openstack-cell1" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.840496 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.843075 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.843344 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.843362 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.845633 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.845633 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.846124 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.846144 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.883780 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-rbc64"] Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.903797 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.903856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.903960 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904036 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfm95\" (UniqueName: \"kubernetes.io/projected/079d074f-7a44-4c65-98ca-68e216036454-kube-api-access-jfm95\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904090 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ssh-key\") pod 
\"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904188 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904260 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904301 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-inventory\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904518 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ceph\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904661 4764 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:32 crc kubenswrapper[4764]: I1204 01:54:32.904693 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.007629 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ceph\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.007793 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.007835 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.007957 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.007999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.008057 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.008103 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfm95\" (UniqueName: \"kubernetes.io/projected/079d074f-7a44-4c65-98ca-68e216036454-kube-api-access-jfm95\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.008153 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.008193 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.008225 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.008251 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-inventory\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.010418 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.010617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.014861 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.014928 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.015039 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ceph\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.015433 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.015989 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.019746 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-inventory\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.024503 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.024667 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.030261 4764 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jfm95\" (UniqueName: \"kubernetes.io/projected/079d074f-7a44-4c65-98ca-68e216036454-kube-api-access-jfm95\") pod \"nova-cell1-openstack-openstack-cell1-rbc64\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.180127 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:54:33 crc kubenswrapper[4764]: I1204 01:54:33.864409 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-rbc64"] Dec 04 01:54:33 crc kubenswrapper[4764]: W1204 01:54:33.873865 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod079d074f_7a44_4c65_98ca_68e216036454.slice/crio-f42dd4962ac25c0c6caf96e12f672fa8a46575321bbfd7f837e9492fdc61dd57 WatchSource:0}: Error finding container f42dd4962ac25c0c6caf96e12f672fa8a46575321bbfd7f837e9492fdc61dd57: Status 404 returned error can't find the container with id f42dd4962ac25c0c6caf96e12f672fa8a46575321bbfd7f837e9492fdc61dd57 Dec 04 01:54:34 crc kubenswrapper[4764]: I1204 01:54:34.763208 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" event={"ID":"079d074f-7a44-4c65-98ca-68e216036454","Type":"ContainerStarted","Data":"e948f8ff5b14b8894e4df1d50cc563c56c771011887752a52ac75d642c5a78a1"} Dec 04 01:54:34 crc kubenswrapper[4764]: I1204 01:54:34.763898 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" event={"ID":"079d074f-7a44-4c65-98ca-68e216036454","Type":"ContainerStarted","Data":"f42dd4962ac25c0c6caf96e12f672fa8a46575321bbfd7f837e9492fdc61dd57"} Dec 04 01:54:34 crc kubenswrapper[4764]: I1204 01:54:34.792156 4764 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" podStartSLOduration=2.241495282 podStartE2EDuration="2.79213626s" podCreationTimestamp="2025-12-04 01:54:32 +0000 UTC" firstStartedPulling="2025-12-04 01:54:33.87799165 +0000 UTC m=+8009.639316071" lastFinishedPulling="2025-12-04 01:54:34.428632638 +0000 UTC m=+8010.189957049" observedRunningTime="2025-12-04 01:54:34.790435629 +0000 UTC m=+8010.551760040" watchObservedRunningTime="2025-12-04 01:54:34.79213626 +0000 UTC m=+8010.553460681" Dec 04 01:56:20 crc kubenswrapper[4764]: I1204 01:56:20.868894 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:56:20 crc kubenswrapper[4764]: I1204 01:56:20.869577 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:56:50 crc kubenswrapper[4764]: I1204 01:56:50.868570 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:56:50 crc kubenswrapper[4764]: I1204 01:56:50.869200 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:57:20 crc kubenswrapper[4764]: I1204 01:57:20.869179 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 01:57:20 crc kubenswrapper[4764]: I1204 01:57:20.869742 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 01:57:20 crc kubenswrapper[4764]: I1204 01:57:20.869799 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 01:57:20 crc kubenswrapper[4764]: I1204 01:57:20.870999 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 01:57:20 crc kubenswrapper[4764]: I1204 01:57:20.871089 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" gracePeriod=600 Dec 04 01:57:21 crc kubenswrapper[4764]: E1204 01:57:21.006044 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:57:21 crc kubenswrapper[4764]: I1204 01:57:21.707999 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" exitCode=0 Dec 04 01:57:21 crc kubenswrapper[4764]: I1204 01:57:21.708085 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da"} Dec 04 01:57:21 crc kubenswrapper[4764]: I1204 01:57:21.708328 4764 scope.go:117] "RemoveContainer" containerID="72961a01b8dd1221f6447e3f2e3a0f15bcf0432f34dd56f1d75da7ab07484f0a" Dec 04 01:57:21 crc kubenswrapper[4764]: I1204 01:57:21.709114 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:57:21 crc kubenswrapper[4764]: E1204 01:57:21.709486 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:57:33 crc kubenswrapper[4764]: I1204 01:57:33.546703 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:57:33 
crc kubenswrapper[4764]: E1204 01:57:33.547767 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:57:44 crc kubenswrapper[4764]: I1204 01:57:44.555758 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:57:44 crc kubenswrapper[4764]: E1204 01:57:44.558414 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:57:47 crc kubenswrapper[4764]: I1204 01:57:47.990553 4764 generic.go:334] "Generic (PLEG): container finished" podID="079d074f-7a44-4c65-98ca-68e216036454" containerID="e948f8ff5b14b8894e4df1d50cc563c56c771011887752a52ac75d642c5a78a1" exitCode=0 Dec 04 01:57:47 crc kubenswrapper[4764]: I1204 01:57:47.990669 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" event={"ID":"079d074f-7a44-4c65-98ca-68e216036454","Type":"ContainerDied","Data":"e948f8ff5b14b8894e4df1d50cc563c56c771011887752a52ac75d642c5a78a1"} Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.509821 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.655842 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ceph\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656376 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-0\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656428 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-inventory\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656472 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-1\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656612 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfm95\" (UniqueName: \"kubernetes.io/projected/079d074f-7a44-4c65-98ca-68e216036454-kube-api-access-jfm95\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656761 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-1\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656839 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-1\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656885 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ssh-key\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.656973 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-0\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.657034 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-combined-ca-bundle\") pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.657076 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-0\") 
pod \"079d074f-7a44-4c65-98ca-68e216036454\" (UID: \"079d074f-7a44-4c65-98ca-68e216036454\") " Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.677384 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ceph" (OuterVolumeSpecName: "ceph") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.679063 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.679495 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/079d074f-7a44-4c65-98ca-68e216036454-kube-api-access-jfm95" (OuterVolumeSpecName: "kube-api-access-jfm95") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "kube-api-access-jfm95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.689002 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-inventory" (OuterVolumeSpecName: "inventory") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.691177 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.693037 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.696605 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.703041 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.704208 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.724925 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.728455 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "079d074f-7a44-4c65-98ca-68e216036454" (UID: "079d074f-7a44-4c65-98ca-68e216036454"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760189 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760219 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760233 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760245 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760260 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760274 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760285 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/079d074f-7a44-4c65-98ca-68e216036454-nova-cells-global-config-0\") on node \"crc\" DevicePath 
\"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760296 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760322 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760334 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfm95\" (UniqueName: \"kubernetes.io/projected/079d074f-7a44-4c65-98ca-68e216036454-kube-api-access-jfm95\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:49 crc kubenswrapper[4764]: I1204 01:57:49.760347 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/079d074f-7a44-4c65-98ca-68e216036454-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.021382 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" event={"ID":"079d074f-7a44-4c65-98ca-68e216036454","Type":"ContainerDied","Data":"f42dd4962ac25c0c6caf96e12f672fa8a46575321bbfd7f837e9492fdc61dd57"} Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.021428 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f42dd4962ac25c0c6caf96e12f672fa8a46575321bbfd7f837e9492fdc61dd57" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.021492 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-rbc64" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.119351 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-t7qqm"] Dec 04 01:57:50 crc kubenswrapper[4764]: E1204 01:57:50.120133 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="079d074f-7a44-4c65-98ca-68e216036454" containerName="nova-cell1-openstack-openstack-cell1" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.120156 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="079d074f-7a44-4c65-98ca-68e216036454" containerName="nova-cell1-openstack-openstack-cell1" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.120432 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="079d074f-7a44-4c65-98ca-68e216036454" containerName="nova-cell1-openstack-openstack-cell1" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.121576 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.123829 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.123884 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.123833 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.124003 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.124365 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.134968 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-t7qqm"] Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269329 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269383 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " 
pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269423 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269548 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75wt\" (UniqueName: \"kubernetes.io/projected/3e94945c-c71a-4fd7-95d1-8609b7bc068e-kube-api-access-z75wt\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269571 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-inventory\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269594 4764 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceph\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.269612 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.371709 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.371840 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.371890 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75wt\" (UniqueName: \"kubernetes.io/projected/3e94945c-c71a-4fd7-95d1-8609b7bc068e-kube-api-access-z75wt\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") 
" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.371914 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-inventory\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.371939 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceph\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.371955 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.372017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.372044 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-2\") pod 
\"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.376495 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.376825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.377762 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ssh-key\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.378330 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-inventory\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.379078 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" 
(UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.380158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.380273 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceph\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.389875 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75wt\" (UniqueName: \"kubernetes.io/projected/3e94945c-c71a-4fd7-95d1-8609b7bc068e-kube-api-access-z75wt\") pod \"telemetry-openstack-openstack-cell1-t7qqm\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:50 crc kubenswrapper[4764]: I1204 01:57:50.437908 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 01:57:51 crc kubenswrapper[4764]: I1204 01:57:51.051034 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-t7qqm"] Dec 04 01:57:52 crc kubenswrapper[4764]: I1204 01:57:52.040166 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" event={"ID":"3e94945c-c71a-4fd7-95d1-8609b7bc068e","Type":"ContainerStarted","Data":"4f858fdef975ed85451d9aee6dae8d09e5601c27335852005b2e1193d6618e15"} Dec 04 01:57:52 crc kubenswrapper[4764]: I1204 01:57:52.040770 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" event={"ID":"3e94945c-c71a-4fd7-95d1-8609b7bc068e","Type":"ContainerStarted","Data":"a75d5f8f4ac5b3fc3d0d7300433e8ee79b198b5edd62606251ef535552a4f4dc"} Dec 04 01:57:52 crc kubenswrapper[4764]: I1204 01:57:52.062849 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" podStartSLOduration=1.6798512570000002 podStartE2EDuration="2.062823429s" podCreationTimestamp="2025-12-04 01:57:50 +0000 UTC" firstStartedPulling="2025-12-04 01:57:51.055654365 +0000 UTC m=+8206.816978776" lastFinishedPulling="2025-12-04 01:57:51.438626537 +0000 UTC m=+8207.199950948" observedRunningTime="2025-12-04 01:57:52.058256897 +0000 UTC m=+8207.819581308" watchObservedRunningTime="2025-12-04 01:57:52.062823429 +0000 UTC m=+8207.824147840" Dec 04 01:57:56 crc kubenswrapper[4764]: I1204 01:57:56.546565 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:57:56 crc kubenswrapper[4764]: E1204 01:57:56.547622 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:58:07 crc kubenswrapper[4764]: I1204 01:58:07.545949 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:58:07 crc kubenswrapper[4764]: E1204 01:58:07.546910 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:58:22 crc kubenswrapper[4764]: I1204 01:58:22.549522 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:58:22 crc kubenswrapper[4764]: E1204 01:58:22.550222 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:58:35 crc kubenswrapper[4764]: I1204 01:58:35.545838 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:58:35 crc kubenswrapper[4764]: E1204 01:58:35.546866 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:58:46 crc kubenswrapper[4764]: I1204 01:58:46.546567 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:58:46 crc kubenswrapper[4764]: E1204 01:58:46.547664 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:58:58 crc kubenswrapper[4764]: I1204 01:58:58.546795 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:58:58 crc kubenswrapper[4764]: E1204 01:58:58.548229 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.714332 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w6p7n"] Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.717424 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.757855 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6p7n"] Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.818386 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-catalog-content\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.818527 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-utilities\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.818567 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5kl2\" (UniqueName: \"kubernetes.io/projected/a88322aa-72b1-4f8a-8b51-6fddac445fae-kube-api-access-k5kl2\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.920303 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-utilities\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.920362 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k5kl2\" (UniqueName: \"kubernetes.io/projected/a88322aa-72b1-4f8a-8b51-6fddac445fae-kube-api-access-k5kl2\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.920484 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-catalog-content\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.921073 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-utilities\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.921100 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-catalog-content\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:03 crc kubenswrapper[4764]: I1204 01:59:03.947814 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5kl2\" (UniqueName: \"kubernetes.io/projected/a88322aa-72b1-4f8a-8b51-6fddac445fae-kube-api-access-k5kl2\") pod \"redhat-operators-w6p7n\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:04 crc kubenswrapper[4764]: I1204 01:59:04.059732 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:04 crc kubenswrapper[4764]: I1204 01:59:04.578513 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w6p7n"] Dec 04 01:59:05 crc kubenswrapper[4764]: I1204 01:59:05.094124 4764 generic.go:334] "Generic (PLEG): container finished" podID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerID="7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2" exitCode=0 Dec 04 01:59:05 crc kubenswrapper[4764]: I1204 01:59:05.094179 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerDied","Data":"7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2"} Dec 04 01:59:05 crc kubenswrapper[4764]: I1204 01:59:05.094397 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerStarted","Data":"0b80b4a200c71f48d8b6adbe31041660119ae2306512f333e2efcdd685822b24"} Dec 04 01:59:05 crc kubenswrapper[4764]: I1204 01:59:05.096497 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 01:59:06 crc kubenswrapper[4764]: I1204 01:59:06.108783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerStarted","Data":"d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba"} Dec 04 01:59:09 crc kubenswrapper[4764]: I1204 01:59:09.142289 4764 generic.go:334] "Generic (PLEG): container finished" podID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerID="d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba" exitCode=0 Dec 04 01:59:09 crc kubenswrapper[4764]: I1204 01:59:09.142900 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerDied","Data":"d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba"} Dec 04 01:59:10 crc kubenswrapper[4764]: I1204 01:59:10.159995 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerStarted","Data":"d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7"} Dec 04 01:59:10 crc kubenswrapper[4764]: I1204 01:59:10.185980 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w6p7n" podStartSLOduration=2.7402641660000002 podStartE2EDuration="7.185920401s" podCreationTimestamp="2025-12-04 01:59:03 +0000 UTC" firstStartedPulling="2025-12-04 01:59:05.096257227 +0000 UTC m=+8280.857581638" lastFinishedPulling="2025-12-04 01:59:09.541913432 +0000 UTC m=+8285.303237873" observedRunningTime="2025-12-04 01:59:10.180102179 +0000 UTC m=+8285.941426600" watchObservedRunningTime="2025-12-04 01:59:10.185920401 +0000 UTC m=+8285.947244832" Dec 04 01:59:12 crc kubenswrapper[4764]: I1204 01:59:12.546849 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:59:12 crc kubenswrapper[4764]: E1204 01:59:12.547758 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:59:14 crc kubenswrapper[4764]: I1204 01:59:14.060765 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:14 crc kubenswrapper[4764]: I1204 01:59:14.061171 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:15 crc kubenswrapper[4764]: I1204 01:59:15.117230 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w6p7n" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="registry-server" probeResult="failure" output=< Dec 04 01:59:15 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 01:59:15 crc kubenswrapper[4764]: > Dec 04 01:59:24 crc kubenswrapper[4764]: I1204 01:59:24.151231 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:24 crc kubenswrapper[4764]: I1204 01:59:24.230516 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:24 crc kubenswrapper[4764]: I1204 01:59:24.413904 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6p7n"] Dec 04 01:59:25 crc kubenswrapper[4764]: I1204 01:59:25.374098 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w6p7n" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="registry-server" containerID="cri-o://d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7" gracePeriod=2 Dec 04 01:59:25 crc kubenswrapper[4764]: I1204 01:59:25.546686 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:59:25 crc kubenswrapper[4764]: E1204 01:59:25.547166 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:59:25 crc kubenswrapper[4764]: I1204 01:59:25.891059 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:25 crc kubenswrapper[4764]: I1204 01:59:25.985559 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-utilities\") pod \"a88322aa-72b1-4f8a-8b51-6fddac445fae\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " Dec 04 01:59:25 crc kubenswrapper[4764]: I1204 01:59:25.985762 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-catalog-content\") pod \"a88322aa-72b1-4f8a-8b51-6fddac445fae\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " Dec 04 01:59:25 crc kubenswrapper[4764]: I1204 01:59:25.986598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-utilities" (OuterVolumeSpecName: "utilities") pod "a88322aa-72b1-4f8a-8b51-6fddac445fae" (UID: "a88322aa-72b1-4f8a-8b51-6fddac445fae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.087639 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5kl2\" (UniqueName: \"kubernetes.io/projected/a88322aa-72b1-4f8a-8b51-6fddac445fae-kube-api-access-k5kl2\") pod \"a88322aa-72b1-4f8a-8b51-6fddac445fae\" (UID: \"a88322aa-72b1-4f8a-8b51-6fddac445fae\") " Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.088223 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.097504 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88322aa-72b1-4f8a-8b51-6fddac445fae-kube-api-access-k5kl2" (OuterVolumeSpecName: "kube-api-access-k5kl2") pod "a88322aa-72b1-4f8a-8b51-6fddac445fae" (UID: "a88322aa-72b1-4f8a-8b51-6fddac445fae"). InnerVolumeSpecName "kube-api-access-k5kl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.147282 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a88322aa-72b1-4f8a-8b51-6fddac445fae" (UID: "a88322aa-72b1-4f8a-8b51-6fddac445fae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.189918 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a88322aa-72b1-4f8a-8b51-6fddac445fae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.189952 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5kl2\" (UniqueName: \"kubernetes.io/projected/a88322aa-72b1-4f8a-8b51-6fddac445fae-kube-api-access-k5kl2\") on node \"crc\" DevicePath \"\"" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.394993 4764 generic.go:334] "Generic (PLEG): container finished" podID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerID="d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7" exitCode=0 Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.395063 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w6p7n" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.395111 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerDied","Data":"d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7"} Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.395190 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w6p7n" event={"ID":"a88322aa-72b1-4f8a-8b51-6fddac445fae","Type":"ContainerDied","Data":"0b80b4a200c71f48d8b6adbe31041660119ae2306512f333e2efcdd685822b24"} Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.395221 4764 scope.go:117] "RemoveContainer" containerID="d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.434407 4764 scope.go:117] "RemoveContainer" 
containerID="d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.472937 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w6p7n"] Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.487758 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w6p7n"] Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.495179 4764 scope.go:117] "RemoveContainer" containerID="7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.559834 4764 scope.go:117] "RemoveContainer" containerID="d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7" Dec 04 01:59:26 crc kubenswrapper[4764]: E1204 01:59:26.560281 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7\": container with ID starting with d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7 not found: ID does not exist" containerID="d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.560317 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7"} err="failed to get container status \"d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7\": rpc error: code = NotFound desc = could not find container \"d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7\": container with ID starting with d8dbb7e0970f8002da5d47e03e9dcca003b23eb71efc6c8bf6fd3c377007ffd7 not found: ID does not exist" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.560337 4764 scope.go:117] "RemoveContainer" 
containerID="d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba" Dec 04 01:59:26 crc kubenswrapper[4764]: E1204 01:59:26.560776 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba\": container with ID starting with d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba not found: ID does not exist" containerID="d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.560848 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba"} err="failed to get container status \"d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba\": rpc error: code = NotFound desc = could not find container \"d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba\": container with ID starting with d7e4876c3137de9fe5b914568891fe14cda3575532940038862bd278c72dfcba not found: ID does not exist" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.560901 4764 scope.go:117] "RemoveContainer" containerID="7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2" Dec 04 01:59:26 crc kubenswrapper[4764]: E1204 01:59:26.561333 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2\": container with ID starting with 7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2 not found: ID does not exist" containerID="7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.561394 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2"} err="failed to get container status \"7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2\": rpc error: code = NotFound desc = could not find container \"7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2\": container with ID starting with 7aa8daeae0453c861532747b6a054cd5da103355de4012935dca6cc8d16d2bd2 not found: ID does not exist" Dec 04 01:59:26 crc kubenswrapper[4764]: I1204 01:59:26.569627 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" path="/var/lib/kubelet/pods/a88322aa-72b1-4f8a-8b51-6fddac445fae/volumes" Dec 04 01:59:40 crc kubenswrapper[4764]: I1204 01:59:40.547567 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:59:40 crc kubenswrapper[4764]: E1204 01:59:40.548444 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 01:59:54 crc kubenswrapper[4764]: I1204 01:59:54.554261 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 01:59:54 crc kubenswrapper[4764]: E1204 01:59:54.555538 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.165030 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns"] Dec 04 02:00:00 crc kubenswrapper[4764]: E1204 02:00:00.166823 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="extract-utilities" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.166861 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="extract-utilities" Dec 04 02:00:00 crc kubenswrapper[4764]: E1204 02:00:00.166903 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="registry-server" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.166922 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="registry-server" Dec 04 02:00:00 crc kubenswrapper[4764]: E1204 02:00:00.166950 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="extract-content" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.166968 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="extract-content" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.167538 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88322aa-72b1-4f8a-8b51-6fddac445fae" containerName="registry-server" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.169509 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.171863 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.172203 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.181175 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns"] Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.261529 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4c8b\" (UniqueName: \"kubernetes.io/projected/3daa185c-2782-4ba6-b9c6-c572a14822d2-kube-api-access-r4c8b\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.261592 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3daa185c-2782-4ba6-b9c6-c572a14822d2-config-volume\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.261702 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3daa185c-2782-4ba6-b9c6-c572a14822d2-secret-volume\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.363796 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3daa185c-2782-4ba6-b9c6-c572a14822d2-config-volume\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.363993 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3daa185c-2782-4ba6-b9c6-c572a14822d2-secret-volume\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.364167 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4c8b\" (UniqueName: \"kubernetes.io/projected/3daa185c-2782-4ba6-b9c6-c572a14822d2-kube-api-access-r4c8b\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.364737 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3daa185c-2782-4ba6-b9c6-c572a14822d2-config-volume\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.375346 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3daa185c-2782-4ba6-b9c6-c572a14822d2-secret-volume\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.393350 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4c8b\" (UniqueName: \"kubernetes.io/projected/3daa185c-2782-4ba6-b9c6-c572a14822d2-kube-api-access-r4c8b\") pod \"collect-profiles-29413560-p8cns\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:00 crc kubenswrapper[4764]: I1204 02:00:00.494546 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:01 crc kubenswrapper[4764]: I1204 02:00:01.055043 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns"] Dec 04 02:00:01 crc kubenswrapper[4764]: I1204 02:00:01.860542 4764 generic.go:334] "Generic (PLEG): container finished" podID="3daa185c-2782-4ba6-b9c6-c572a14822d2" containerID="070ce0dba951756fcf8e01f60e220647e2e7d1772185a30506f96237b0cd26e5" exitCode=0 Dec 04 02:00:01 crc kubenswrapper[4764]: I1204 02:00:01.860618 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" event={"ID":"3daa185c-2782-4ba6-b9c6-c572a14822d2","Type":"ContainerDied","Data":"070ce0dba951756fcf8e01f60e220647e2e7d1772185a30506f96237b0cd26e5"} Dec 04 02:00:01 crc kubenswrapper[4764]: I1204 02:00:01.860984 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" 
event={"ID":"3daa185c-2782-4ba6-b9c6-c572a14822d2","Type":"ContainerStarted","Data":"8ace37dd964643053c2fc8a81f032b7dc85db19a640e23e604fc34495b72665d"} Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.371368 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.435581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3daa185c-2782-4ba6-b9c6-c572a14822d2-secret-volume\") pod \"3daa185c-2782-4ba6-b9c6-c572a14822d2\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.435669 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4c8b\" (UniqueName: \"kubernetes.io/projected/3daa185c-2782-4ba6-b9c6-c572a14822d2-kube-api-access-r4c8b\") pod \"3daa185c-2782-4ba6-b9c6-c572a14822d2\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.435850 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3daa185c-2782-4ba6-b9c6-c572a14822d2-config-volume\") pod \"3daa185c-2782-4ba6-b9c6-c572a14822d2\" (UID: \"3daa185c-2782-4ba6-b9c6-c572a14822d2\") " Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.436832 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa185c-2782-4ba6-b9c6-c572a14822d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "3daa185c-2782-4ba6-b9c6-c572a14822d2" (UID: "3daa185c-2782-4ba6-b9c6-c572a14822d2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.443619 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3daa185c-2782-4ba6-b9c6-c572a14822d2-kube-api-access-r4c8b" (OuterVolumeSpecName: "kube-api-access-r4c8b") pod "3daa185c-2782-4ba6-b9c6-c572a14822d2" (UID: "3daa185c-2782-4ba6-b9c6-c572a14822d2"). InnerVolumeSpecName "kube-api-access-r4c8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.444820 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3daa185c-2782-4ba6-b9c6-c572a14822d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3daa185c-2782-4ba6-b9c6-c572a14822d2" (UID: "3daa185c-2782-4ba6-b9c6-c572a14822d2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.538942 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3daa185c-2782-4ba6-b9c6-c572a14822d2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.539001 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4c8b\" (UniqueName: \"kubernetes.io/projected/3daa185c-2782-4ba6-b9c6-c572a14822d2-kube-api-access-r4c8b\") on node \"crc\" DevicePath \"\"" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.539024 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3daa185c-2782-4ba6-b9c6-c572a14822d2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.892131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" 
event={"ID":"3daa185c-2782-4ba6-b9c6-c572a14822d2","Type":"ContainerDied","Data":"8ace37dd964643053c2fc8a81f032b7dc85db19a640e23e604fc34495b72665d"} Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.892194 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ace37dd964643053c2fc8a81f032b7dc85db19a640e23e604fc34495b72665d" Dec 04 02:00:03 crc kubenswrapper[4764]: I1204 02:00:03.892258 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413560-p8cns" Dec 04 02:00:04 crc kubenswrapper[4764]: I1204 02:00:04.482153 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k"] Dec 04 02:00:04 crc kubenswrapper[4764]: I1204 02:00:04.494824 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413515-vws9k"] Dec 04 02:00:04 crc kubenswrapper[4764]: I1204 02:00:04.568835 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d933b64c-ee5e-4ad7-b62d-36db00ebef8b" path="/var/lib/kubelet/pods/d933b64c-ee5e-4ad7-b62d-36db00ebef8b/volumes" Dec 04 02:00:09 crc kubenswrapper[4764]: I1204 02:00:09.548057 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:00:09 crc kubenswrapper[4764]: E1204 02:00:09.549329 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:00:22 crc kubenswrapper[4764]: I1204 02:00:22.201108 4764 scope.go:117] "RemoveContainer" 
containerID="a2b92b56099585bb86e902a2ad8678367bfc0b765db991f43d8d751f33ec46bb" Dec 04 02:00:23 crc kubenswrapper[4764]: I1204 02:00:23.546621 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:00:23 crc kubenswrapper[4764]: E1204 02:00:23.547312 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:00:34 crc kubenswrapper[4764]: I1204 02:00:34.555300 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:00:34 crc kubenswrapper[4764]: E1204 02:00:34.556530 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:00:48 crc kubenswrapper[4764]: I1204 02:00:48.546534 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:00:48 crc kubenswrapper[4764]: E1204 02:00:48.547225 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.160350 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29413561-s55gq"] Dec 04 02:01:00 crc kubenswrapper[4764]: E1204 02:01:00.161434 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3daa185c-2782-4ba6-b9c6-c572a14822d2" containerName="collect-profiles" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.161450 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3daa185c-2782-4ba6-b9c6-c572a14822d2" containerName="collect-profiles" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.161736 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3daa185c-2782-4ba6-b9c6-c572a14822d2" containerName="collect-profiles" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.162610 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.205835 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413561-s55gq"] Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.249422 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-fernet-keys\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.249608 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9z6\" (UniqueName: \"kubernetes.io/projected/22911e90-422c-404d-8b0a-1e1fdd0f731a-kube-api-access-7w9z6\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") 
" pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.249659 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-combined-ca-bundle\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.249766 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-config-data\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.352669 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-combined-ca-bundle\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.353033 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-config-data\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.353171 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-fernet-keys\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " 
pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.353949 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9z6\" (UniqueName: \"kubernetes.io/projected/22911e90-422c-404d-8b0a-1e1fdd0f731a-kube-api-access-7w9z6\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.360183 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-combined-ca-bundle\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.360506 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-fernet-keys\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.381598 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-config-data\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc kubenswrapper[4764]: I1204 02:01:00.385608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9z6\" (UniqueName: \"kubernetes.io/projected/22911e90-422c-404d-8b0a-1e1fdd0f731a-kube-api-access-7w9z6\") pod \"keystone-cron-29413561-s55gq\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:00 crc 
kubenswrapper[4764]: I1204 02:01:00.499309 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:01 crc kubenswrapper[4764]: I1204 02:01:01.020811 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413561-s55gq"] Dec 04 02:01:01 crc kubenswrapper[4764]: I1204 02:01:01.623836 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413561-s55gq" event={"ID":"22911e90-422c-404d-8b0a-1e1fdd0f731a","Type":"ContainerStarted","Data":"29c838ace8c61b83f5e2b0f6892ace916d5a6db264a471b89e5997b1ab408210"} Dec 04 02:01:01 crc kubenswrapper[4764]: I1204 02:01:01.624164 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413561-s55gq" event={"ID":"22911e90-422c-404d-8b0a-1e1fdd0f731a","Type":"ContainerStarted","Data":"d8b6acfe9d03c9fbc7178ac7420fb3102ccabaead2d1a31552a86c19e4a1d50d"} Dec 04 02:01:01 crc kubenswrapper[4764]: I1204 02:01:01.656415 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29413561-s55gq" podStartSLOduration=1.656379834 podStartE2EDuration="1.656379834s" podCreationTimestamp="2025-12-04 02:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 02:01:01.644698917 +0000 UTC m=+8397.406023328" watchObservedRunningTime="2025-12-04 02:01:01.656379834 +0000 UTC m=+8397.417704285" Dec 04 02:01:03 crc kubenswrapper[4764]: I1204 02:01:03.546237 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:01:03 crc kubenswrapper[4764]: E1204 02:01:03.546834 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:01:03 crc kubenswrapper[4764]: I1204 02:01:03.644567 4764 generic.go:334] "Generic (PLEG): container finished" podID="22911e90-422c-404d-8b0a-1e1fdd0f731a" containerID="29c838ace8c61b83f5e2b0f6892ace916d5a6db264a471b89e5997b1ab408210" exitCode=0 Dec 04 02:01:03 crc kubenswrapper[4764]: I1204 02:01:03.644667 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413561-s55gq" event={"ID":"22911e90-422c-404d-8b0a-1e1fdd0f731a","Type":"ContainerDied","Data":"29c838ace8c61b83f5e2b0f6892ace916d5a6db264a471b89e5997b1ab408210"} Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.042115 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.173299 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-fernet-keys\") pod \"22911e90-422c-404d-8b0a-1e1fdd0f731a\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.173405 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9z6\" (UniqueName: \"kubernetes.io/projected/22911e90-422c-404d-8b0a-1e1fdd0f731a-kube-api-access-7w9z6\") pod \"22911e90-422c-404d-8b0a-1e1fdd0f731a\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.173514 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-config-data\") pod \"22911e90-422c-404d-8b0a-1e1fdd0f731a\" (UID: 
\"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.173635 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-combined-ca-bundle\") pod \"22911e90-422c-404d-8b0a-1e1fdd0f731a\" (UID: \"22911e90-422c-404d-8b0a-1e1fdd0f731a\") " Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.180341 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "22911e90-422c-404d-8b0a-1e1fdd0f731a" (UID: "22911e90-422c-404d-8b0a-1e1fdd0f731a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.181133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22911e90-422c-404d-8b0a-1e1fdd0f731a-kube-api-access-7w9z6" (OuterVolumeSpecName: "kube-api-access-7w9z6") pod "22911e90-422c-404d-8b0a-1e1fdd0f731a" (UID: "22911e90-422c-404d-8b0a-1e1fdd0f731a"). InnerVolumeSpecName "kube-api-access-7w9z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.238100 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22911e90-422c-404d-8b0a-1e1fdd0f731a" (UID: "22911e90-422c-404d-8b0a-1e1fdd0f731a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.276127 4764 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.276164 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9z6\" (UniqueName: \"kubernetes.io/projected/22911e90-422c-404d-8b0a-1e1fdd0f731a-kube-api-access-7w9z6\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.276177 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.286189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-config-data" (OuterVolumeSpecName: "config-data") pod "22911e90-422c-404d-8b0a-1e1fdd0f731a" (UID: "22911e90-422c-404d-8b0a-1e1fdd0f731a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.378083 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22911e90-422c-404d-8b0a-1e1fdd0f731a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.664231 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413561-s55gq" event={"ID":"22911e90-422c-404d-8b0a-1e1fdd0f731a","Type":"ContainerDied","Data":"d8b6acfe9d03c9fbc7178ac7420fb3102ccabaead2d1a31552a86c19e4a1d50d"} Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.664545 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8b6acfe9d03c9fbc7178ac7420fb3102ccabaead2d1a31552a86c19e4a1d50d" Dec 04 02:01:05 crc kubenswrapper[4764]: I1204 02:01:05.664332 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413561-s55gq" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.777776 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f8p5s"] Dec 04 02:01:16 crc kubenswrapper[4764]: E1204 02:01:16.778989 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22911e90-422c-404d-8b0a-1e1fdd0f731a" containerName="keystone-cron" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.779007 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="22911e90-422c-404d-8b0a-1e1fdd0f731a" containerName="keystone-cron" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.779286 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="22911e90-422c-404d-8b0a-1e1fdd0f731a" containerName="keystone-cron" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.781299 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.815226 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8p5s"] Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.905788 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-catalog-content\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.906138 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-utilities\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:16 crc kubenswrapper[4764]: I1204 02:01:16.906370 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xb8\" (UniqueName: \"kubernetes.io/projected/c5993009-ed7b-4d04-afeb-ce5521f4f581-kube-api-access-v8xb8\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.008643 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-utilities\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.008755 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8xb8\" (UniqueName: \"kubernetes.io/projected/c5993009-ed7b-4d04-afeb-ce5521f4f581-kube-api-access-v8xb8\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.008896 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-catalog-content\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.009179 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-utilities\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.009289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-catalog-content\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.026596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xb8\" (UniqueName: \"kubernetes.io/projected/c5993009-ed7b-4d04-afeb-ce5521f4f581-kube-api-access-v8xb8\") pod \"community-operators-f8p5s\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.100917 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.645485 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f8p5s"] Dec 04 02:01:17 crc kubenswrapper[4764]: I1204 02:01:17.815873 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerStarted","Data":"2e4e882a32c6b5e21b59940a9e0fd4a173abdebc02c1f29beb0d326daaaec88a"} Dec 04 02:01:18 crc kubenswrapper[4764]: I1204 02:01:18.546319 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:01:18 crc kubenswrapper[4764]: E1204 02:01:18.546669 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:01:18 crc kubenswrapper[4764]: I1204 02:01:18.829112 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerID="a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc" exitCode=0 Dec 04 02:01:18 crc kubenswrapper[4764]: I1204 02:01:18.829153 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerDied","Data":"a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc"} Dec 04 02:01:19 crc kubenswrapper[4764]: I1204 02:01:19.844401 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" 
event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerStarted","Data":"f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29"} Dec 04 02:01:20 crc kubenswrapper[4764]: I1204 02:01:20.859982 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerID="f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29" exitCode=0 Dec 04 02:01:20 crc kubenswrapper[4764]: I1204 02:01:20.860057 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerDied","Data":"f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29"} Dec 04 02:01:21 crc kubenswrapper[4764]: I1204 02:01:21.873686 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerStarted","Data":"b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed"} Dec 04 02:01:21 crc kubenswrapper[4764]: I1204 02:01:21.909412 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f8p5s" podStartSLOduration=3.38547921 podStartE2EDuration="5.909393388s" podCreationTimestamp="2025-12-04 02:01:16 +0000 UTC" firstStartedPulling="2025-12-04 02:01:18.832178017 +0000 UTC m=+8414.593502468" lastFinishedPulling="2025-12-04 02:01:21.356092215 +0000 UTC m=+8417.117416646" observedRunningTime="2025-12-04 02:01:21.901899314 +0000 UTC m=+8417.663223735" watchObservedRunningTime="2025-12-04 02:01:21.909393388 +0000 UTC m=+8417.670717799" Dec 04 02:01:27 crc kubenswrapper[4764]: I1204 02:01:27.102305 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:27 crc kubenswrapper[4764]: I1204 02:01:27.102729 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:27 crc kubenswrapper[4764]: I1204 02:01:27.196602 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:27 crc kubenswrapper[4764]: I1204 02:01:27.979390 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:28 crc kubenswrapper[4764]: I1204 02:01:28.027927 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8p5s"] Dec 04 02:01:29 crc kubenswrapper[4764]: I1204 02:01:29.953334 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f8p5s" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="registry-server" containerID="cri-o://b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed" gracePeriod=2 Dec 04 02:01:30 crc kubenswrapper[4764]: I1204 02:01:30.965826 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:30 crc kubenswrapper[4764]: I1204 02:01:30.973034 4764 generic.go:334] "Generic (PLEG): container finished" podID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerID="b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed" exitCode=0 Dec 04 02:01:30 crc kubenswrapper[4764]: I1204 02:01:30.973331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerDied","Data":"b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed"} Dec 04 02:01:30 crc kubenswrapper[4764]: I1204 02:01:30.973520 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f8p5s" event={"ID":"c5993009-ed7b-4d04-afeb-ce5521f4f581","Type":"ContainerDied","Data":"2e4e882a32c6b5e21b59940a9e0fd4a173abdebc02c1f29beb0d326daaaec88a"} Dec 04 02:01:30 crc kubenswrapper[4764]: I1204 02:01:30.973747 4764 scope.go:117] "RemoveContainer" containerID="b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed" Dec 04 02:01:30 crc kubenswrapper[4764]: I1204 02:01:30.974169 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f8p5s" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.005452 4764 scope.go:117] "RemoveContainer" containerID="f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.094663 4764 scope.go:117] "RemoveContainer" containerID="a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.133945 4764 scope.go:117] "RemoveContainer" containerID="b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed" Dec 04 02:01:31 crc kubenswrapper[4764]: E1204 02:01:31.134509 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed\": container with ID starting with b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed not found: ID does not exist" containerID="b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.134556 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed"} err="failed to get container status \"b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed\": rpc error: code = NotFound desc = could not find container \"b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed\": container with ID starting with b63347e1cee31e36af97ff6dea9288423d6509c941dacfb08115e60dfb7d1fed not found: ID does not exist" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.134606 4764 scope.go:117] "RemoveContainer" containerID="f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29" Dec 04 02:01:31 crc kubenswrapper[4764]: E1204 02:01:31.134987 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29\": container with ID starting with f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29 not found: ID does not exist" containerID="f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.135049 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29"} err="failed to get container status \"f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29\": rpc error: code = NotFound desc = could not find container \"f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29\": container with ID starting with f2195b9535509fca122293c64aabd22a21bf59d7680dd14bb654068614195b29 not found: ID does not exist" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.135090 4764 scope.go:117] "RemoveContainer" containerID="a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc" Dec 04 02:01:31 crc kubenswrapper[4764]: E1204 02:01:31.135391 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc\": container with ID starting with a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc not found: ID does not exist" containerID="a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.135418 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc"} err="failed to get container status \"a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc\": rpc error: code = NotFound desc = could not find container 
\"a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc\": container with ID starting with a4b978c07c2d2f7f7924ec81221d9c6179ab988776b90beeec2705bd7b484afc not found: ID does not exist" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.139703 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-catalog-content\") pod \"c5993009-ed7b-4d04-afeb-ce5521f4f581\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.139982 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8xb8\" (UniqueName: \"kubernetes.io/projected/c5993009-ed7b-4d04-afeb-ce5521f4f581-kube-api-access-v8xb8\") pod \"c5993009-ed7b-4d04-afeb-ce5521f4f581\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.140025 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-utilities\") pod \"c5993009-ed7b-4d04-afeb-ce5521f4f581\" (UID: \"c5993009-ed7b-4d04-afeb-ce5521f4f581\") " Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.141259 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-utilities" (OuterVolumeSpecName: "utilities") pod "c5993009-ed7b-4d04-afeb-ce5521f4f581" (UID: "c5993009-ed7b-4d04-afeb-ce5521f4f581"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.149956 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5993009-ed7b-4d04-afeb-ce5521f4f581-kube-api-access-v8xb8" (OuterVolumeSpecName: "kube-api-access-v8xb8") pod "c5993009-ed7b-4d04-afeb-ce5521f4f581" (UID: "c5993009-ed7b-4d04-afeb-ce5521f4f581"). InnerVolumeSpecName "kube-api-access-v8xb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.197260 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5993009-ed7b-4d04-afeb-ce5521f4f581" (UID: "c5993009-ed7b-4d04-afeb-ce5521f4f581"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.243871 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8xb8\" (UniqueName: \"kubernetes.io/projected/c5993009-ed7b-4d04-afeb-ce5521f4f581-kube-api-access-v8xb8\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.244020 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.244043 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5993009-ed7b-4d04-afeb-ce5521f4f581-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 02:01:31.320926 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f8p5s"] Dec 04 02:01:31 crc kubenswrapper[4764]: I1204 
02:01:31.331770 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f8p5s"] Dec 04 02:01:32 crc kubenswrapper[4764]: I1204 02:01:32.556958 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" path="/var/lib/kubelet/pods/c5993009-ed7b-4d04-afeb-ce5521f4f581/volumes" Dec 04 02:01:33 crc kubenswrapper[4764]: I1204 02:01:33.545698 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:01:33 crc kubenswrapper[4764]: E1204 02:01:33.546415 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:01:45 crc kubenswrapper[4764]: I1204 02:01:45.546457 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:01:45 crc kubenswrapper[4764]: E1204 02:01:45.547408 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:01:52 crc kubenswrapper[4764]: I1204 02:01:52.226879 4764 generic.go:334] "Generic (PLEG): container finished" podID="3e94945c-c71a-4fd7-95d1-8609b7bc068e" containerID="4f858fdef975ed85451d9aee6dae8d09e5601c27335852005b2e1193d6618e15" exitCode=0 Dec 04 02:01:52 crc 
kubenswrapper[4764]: I1204 02:01:52.227432 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" event={"ID":"3e94945c-c71a-4fd7-95d1-8609b7bc068e","Type":"ContainerDied","Data":"4f858fdef975ed85451d9aee6dae8d09e5601c27335852005b2e1193d6618e15"} Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.704999 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.774543 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ssh-key\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.775385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-0\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.775693 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-1\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.776008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-inventory\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc 
kubenswrapper[4764]: I1204 02:01:53.776078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75wt\" (UniqueName: \"kubernetes.io/projected/3e94945c-c71a-4fd7-95d1-8609b7bc068e-kube-api-access-z75wt\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.776155 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-2\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.776181 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-telemetry-combined-ca-bundle\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:53 crc kubenswrapper[4764]: I1204 02:01:53.776233 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceph\") pod \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\" (UID: \"3e94945c-c71a-4fd7-95d1-8609b7bc068e\") " Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.043398 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e94945c-c71a-4fd7-95d1-8609b7bc068e-kube-api-access-z75wt" (OuterVolumeSpecName: "kube-api-access-z75wt") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "kube-api-access-z75wt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.044189 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceph" (OuterVolumeSpecName: "ceph") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.044554 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.050213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.050271 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.050298 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-inventory" (OuterVolumeSpecName: "inventory") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.050289 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.052374 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "3e94945c-c71a-4fd7-95d1-8609b7bc068e" (UID: "3e94945c-c71a-4fd7-95d1-8609b7bc068e"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082146 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082186 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082202 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082215 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082229 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082242 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75wt\" (UniqueName: \"kubernetes.io/projected/3e94945c-c71a-4fd7-95d1-8609b7bc068e-kube-api-access-z75wt\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082255 4764 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 04 
02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.082267 4764 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e94945c-c71a-4fd7-95d1-8609b7bc068e-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.251159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" event={"ID":"3e94945c-c71a-4fd7-95d1-8609b7bc068e","Type":"ContainerDied","Data":"a75d5f8f4ac5b3fc3d0d7300433e8ee79b198b5edd62606251ef535552a4f4dc"} Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.251203 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75d5f8f4ac5b3fc3d0d7300433e8ee79b198b5edd62606251ef535552a4f4dc" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.251575 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-t7qqm" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.353681 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-crnht"] Dec 04 02:01:54 crc kubenswrapper[4764]: E1204 02:01:54.354312 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e94945c-c71a-4fd7-95d1-8609b7bc068e" containerName="telemetry-openstack-openstack-cell1" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.354334 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e94945c-c71a-4fd7-95d1-8609b7bc068e" containerName="telemetry-openstack-openstack-cell1" Dec 04 02:01:54 crc kubenswrapper[4764]: E1204 02:01:54.354353 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="extract-utilities" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.354363 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="extract-utilities" Dec 04 02:01:54 crc kubenswrapper[4764]: E1204 02:01:54.354407 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="registry-server" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.354417 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="registry-server" Dec 04 02:01:54 crc kubenswrapper[4764]: E1204 02:01:54.354432 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="extract-content" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.354441 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="extract-content" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.354700 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e94945c-c71a-4fd7-95d1-8609b7bc068e" containerName="telemetry-openstack-openstack-cell1" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.354715 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5993009-ed7b-4d04-afeb-ce5521f4f581" containerName="registry-server" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.355704 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.361420 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.361798 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.361994 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.362239 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.364497 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.369064 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-crnht"] Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.388803 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.388856 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 
02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.388983 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.389053 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.389086 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q7vb\" (UniqueName: \"kubernetes.io/projected/b8f75c01-301b-43f4-9d15-ad19080d1ba9-kube-api-access-2q7vb\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.389223 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.490890 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.490942 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q7vb\" (UniqueName: \"kubernetes.io/projected/b8f75c01-301b-43f4-9d15-ad19080d1ba9-kube-api-access-2q7vb\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.491083 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.491135 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.491160 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc 
kubenswrapper[4764]: I1204 02:01:54.491240 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.494709 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.494909 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.495187 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.495265 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ceph\") pod 
\"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.495782 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.507608 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q7vb\" (UniqueName: \"kubernetes.io/projected/b8f75c01-301b-43f4-9d15-ad19080d1ba9-kube-api-access-2q7vb\") pod \"neutron-sriov-openstack-openstack-cell1-crnht\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:54 crc kubenswrapper[4764]: I1204 02:01:54.678006 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:01:55 crc kubenswrapper[4764]: I1204 02:01:55.297924 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-crnht"] Dec 04 02:01:56 crc kubenswrapper[4764]: I1204 02:01:56.273015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" event={"ID":"b8f75c01-301b-43f4-9d15-ad19080d1ba9","Type":"ContainerStarted","Data":"9af366115a7d78a9b94525085a65174a34e31428ec868f4f67448f0b07ef6d66"} Dec 04 02:01:56 crc kubenswrapper[4764]: I1204 02:01:56.273384 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" event={"ID":"b8f75c01-301b-43f4-9d15-ad19080d1ba9","Type":"ContainerStarted","Data":"c52192dc9c2eabfde8ad0136eebd0de9eb5cf8b2884f636b89c41759da00a6a5"} Dec 04 02:01:56 crc kubenswrapper[4764]: I1204 02:01:56.296486 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" podStartSLOduration=1.854006091 podStartE2EDuration="2.296469753s" podCreationTimestamp="2025-12-04 02:01:54 +0000 UTC" firstStartedPulling="2025-12-04 02:01:55.305585617 +0000 UTC m=+8451.066910028" lastFinishedPulling="2025-12-04 02:01:55.748049279 +0000 UTC m=+8451.509373690" observedRunningTime="2025-12-04 02:01:56.293490319 +0000 UTC m=+8452.054814740" watchObservedRunningTime="2025-12-04 02:01:56.296469753 +0000 UTC m=+8452.057794164" Dec 04 02:01:57 crc kubenswrapper[4764]: I1204 02:01:57.546043 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:01:57 crc kubenswrapper[4764]: E1204 02:01:57.546777 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:02:11 crc kubenswrapper[4764]: I1204 02:02:11.546694 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:02:11 crc kubenswrapper[4764]: E1204 02:02:11.550133 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:02:25 crc kubenswrapper[4764]: I1204 02:02:25.546044 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:02:26 crc kubenswrapper[4764]: I1204 02:02:26.618952 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"68c979878c95f8640cce7cc5379dae1c33588f12c07a9605e16e2a51f711a2c5"} Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.242091 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9nskz"] Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.249866 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.275542 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nskz"] Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.433048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-catalog-content\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.433292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-utilities\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.433366 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrfl\" (UniqueName: \"kubernetes.io/projected/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-kube-api-access-jzrfl\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.535092 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-utilities\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.535213 4764 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jzrfl\" (UniqueName: \"kubernetes.io/projected/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-kube-api-access-jzrfl\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.535247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-catalog-content\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.535640 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-utilities\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.535893 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-catalog-content\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.557841 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrfl\" (UniqueName: \"kubernetes.io/projected/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-kube-api-access-jzrfl\") pod \"redhat-marketplace-9nskz\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:07 crc kubenswrapper[4764]: I1204 02:04:07.575643 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:08 crc kubenswrapper[4764]: I1204 02:04:08.061495 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nskz"] Dec 04 02:04:08 crc kubenswrapper[4764]: I1204 02:04:08.890076 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerID="b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98" exitCode=0 Dec 04 02:04:08 crc kubenswrapper[4764]: I1204 02:04:08.890192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerDied","Data":"b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98"} Dec 04 02:04:08 crc kubenswrapper[4764]: I1204 02:04:08.890522 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerStarted","Data":"bcef91686ee06ffce0e320c26d1aa0c3f9a8d287659dbbaf44f44cf388cbd383"} Dec 04 02:04:08 crc kubenswrapper[4764]: I1204 02:04:08.893182 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 02:04:09 crc kubenswrapper[4764]: I1204 02:04:09.915241 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerStarted","Data":"e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db"} Dec 04 02:04:10 crc kubenswrapper[4764]: I1204 02:04:10.932233 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerID="e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db" exitCode=0 Dec 04 02:04:10 crc kubenswrapper[4764]: I1204 02:04:10.932320 4764 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerDied","Data":"e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db"} Dec 04 02:04:11 crc kubenswrapper[4764]: I1204 02:04:11.945455 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerStarted","Data":"bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814"} Dec 04 02:04:11 crc kubenswrapper[4764]: I1204 02:04:11.970626 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9nskz" podStartSLOduration=2.512268383 podStartE2EDuration="4.970606511s" podCreationTimestamp="2025-12-04 02:04:07 +0000 UTC" firstStartedPulling="2025-12-04 02:04:08.892864777 +0000 UTC m=+8584.654189198" lastFinishedPulling="2025-12-04 02:04:11.351202895 +0000 UTC m=+8587.112527326" observedRunningTime="2025-12-04 02:04:11.970469417 +0000 UTC m=+8587.731793828" watchObservedRunningTime="2025-12-04 02:04:11.970606511 +0000 UTC m=+8587.731930922" Dec 04 02:04:17 crc kubenswrapper[4764]: I1204 02:04:17.575858 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:17 crc kubenswrapper[4764]: I1204 02:04:17.576427 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:17 crc kubenswrapper[4764]: I1204 02:04:17.622967 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:18 crc kubenswrapper[4764]: I1204 02:04:18.082251 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:18 crc kubenswrapper[4764]: I1204 02:04:18.155499 4764 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nskz"] Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.040022 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9nskz" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="registry-server" containerID="cri-o://bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814" gracePeriod=2 Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.560647 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.698812 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzrfl\" (UniqueName: \"kubernetes.io/projected/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-kube-api-access-jzrfl\") pod \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.698873 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-utilities\") pod \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.699036 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-catalog-content\") pod \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\" (UID: \"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246\") " Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.701702 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-utilities" (OuterVolumeSpecName: "utilities") pod 
"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" (UID: "7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.705637 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-kube-api-access-jzrfl" (OuterVolumeSpecName: "kube-api-access-jzrfl") pod "7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" (UID: "7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246"). InnerVolumeSpecName "kube-api-access-jzrfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.735275 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" (UID: "7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.804415 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzrfl\" (UniqueName: \"kubernetes.io/projected/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-kube-api-access-jzrfl\") on node \"crc\" DevicePath \"\"" Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.804487 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:04:20 crc kubenswrapper[4764]: I1204 02:04:20.804518 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.061502 4764 generic.go:334] "Generic (PLEG): container finished" podID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerID="bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814" exitCode=0 Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.061678 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerDied","Data":"bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814"} Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.061693 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nskz" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.061781 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nskz" event={"ID":"7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246","Type":"ContainerDied","Data":"bcef91686ee06ffce0e320c26d1aa0c3f9a8d287659dbbaf44f44cf388cbd383"} Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.061800 4764 scope.go:117] "RemoveContainer" containerID="bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.111057 4764 scope.go:117] "RemoveContainer" containerID="e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.129225 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nskz"] Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.143061 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nskz"] Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.162857 4764 scope.go:117] "RemoveContainer" containerID="b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.217775 4764 scope.go:117] "RemoveContainer" containerID="bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814" Dec 04 02:04:21 crc kubenswrapper[4764]: E1204 02:04:21.218394 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814\": container with ID starting with bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814 not found: ID does not exist" containerID="bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.218453 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814"} err="failed to get container status \"bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814\": rpc error: code = NotFound desc = could not find container \"bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814\": container with ID starting with bb94eb4876f14a936a8cb9c6aa803c13d9fd1a323a93f715002466402891e814 not found: ID does not exist" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.218491 4764 scope.go:117] "RemoveContainer" containerID="e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db" Dec 04 02:04:21 crc kubenswrapper[4764]: E1204 02:04:21.219211 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db\": container with ID starting with e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db not found: ID does not exist" containerID="e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.219273 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db"} err="failed to get container status \"e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db\": rpc error: code = NotFound desc = could not find container \"e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db\": container with ID starting with e6ab9b314ee5befed23deaf6ad005e59370d1ef33522642c4489686998bf19db not found: ID does not exist" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.219310 4764 scope.go:117] "RemoveContainer" containerID="b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98" Dec 04 02:04:21 crc kubenswrapper[4764]: E1204 
02:04:21.219848 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98\": container with ID starting with b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98 not found: ID does not exist" containerID="b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98" Dec 04 02:04:21 crc kubenswrapper[4764]: I1204 02:04:21.219903 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98"} err="failed to get container status \"b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98\": rpc error: code = NotFound desc = could not find container \"b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98\": container with ID starting with b1ce014829ea1fbf0cdeb6b38a43fb9eae4bf7cd3ba0f9232a92e1b7db093a98 not found: ID does not exist" Dec 04 02:04:22 crc kubenswrapper[4764]: I1204 02:04:22.561543 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" path="/var/lib/kubelet/pods/7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246/volumes" Dec 04 02:04:50 crc kubenswrapper[4764]: I1204 02:04:50.868769 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:04:50 crc kubenswrapper[4764]: I1204 02:04:50.869415 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.944504 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-97f6q"] Dec 04 02:05:06 crc kubenswrapper[4764]: E1204 02:05:06.946000 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="extract-content" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.946035 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="extract-content" Dec 04 02:05:06 crc kubenswrapper[4764]: E1204 02:05:06.946108 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="registry-server" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.946126 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="registry-server" Dec 04 02:05:06 crc kubenswrapper[4764]: E1204 02:05:06.946187 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="extract-utilities" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.946207 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="extract-utilities" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.946871 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbcd5a1-f716-4dbb-8ffb-cf9f3b9c0246" containerName="registry-server" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.951800 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:06 crc kubenswrapper[4764]: I1204 02:05:06.981924 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97f6q"] Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.052320 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqqlt\" (UniqueName: \"kubernetes.io/projected/685afb14-a588-44e2-bf5a-4fa12ba964e5-kube-api-access-jqqlt\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.052846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-catalog-content\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.053295 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-utilities\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.159411 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-utilities\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.159509 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jqqlt\" (UniqueName: \"kubernetes.io/projected/685afb14-a588-44e2-bf5a-4fa12ba964e5-kube-api-access-jqqlt\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.159651 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-catalog-content\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.160477 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-catalog-content\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.160566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-utilities\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.187941 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqqlt\" (UniqueName: \"kubernetes.io/projected/685afb14-a588-44e2-bf5a-4fa12ba964e5-kube-api-access-jqqlt\") pod \"certified-operators-97f6q\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.282824 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:07 crc kubenswrapper[4764]: I1204 02:05:07.852494 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-97f6q"] Dec 04 02:05:08 crc kubenswrapper[4764]: I1204 02:05:08.654818 4764 generic.go:334] "Generic (PLEG): container finished" podID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerID="f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78" exitCode=0 Dec 04 02:05:08 crc kubenswrapper[4764]: I1204 02:05:08.654878 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerDied","Data":"f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78"} Dec 04 02:05:08 crc kubenswrapper[4764]: I1204 02:05:08.655637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerStarted","Data":"1be315516c7abbee574ad9d759dc79daac616493fa29cc757bfadb2e7ed087b9"} Dec 04 02:05:09 crc kubenswrapper[4764]: I1204 02:05:09.672650 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerStarted","Data":"fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2"} Dec 04 02:05:10 crc kubenswrapper[4764]: I1204 02:05:10.689261 4764 generic.go:334] "Generic (PLEG): container finished" podID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerID="fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2" exitCode=0 Dec 04 02:05:10 crc kubenswrapper[4764]: I1204 02:05:10.689326 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" 
event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerDied","Data":"fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2"} Dec 04 02:05:11 crc kubenswrapper[4764]: I1204 02:05:11.703359 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerStarted","Data":"4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364"} Dec 04 02:05:11 crc kubenswrapper[4764]: I1204 02:05:11.726481 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-97f6q" podStartSLOduration=3.249249585 podStartE2EDuration="5.72646399s" podCreationTimestamp="2025-12-04 02:05:06 +0000 UTC" firstStartedPulling="2025-12-04 02:05:08.659347611 +0000 UTC m=+8644.420672042" lastFinishedPulling="2025-12-04 02:05:11.136562006 +0000 UTC m=+8646.897886447" observedRunningTime="2025-12-04 02:05:11.721911918 +0000 UTC m=+8647.483236369" watchObservedRunningTime="2025-12-04 02:05:11.72646399 +0000 UTC m=+8647.487788401" Dec 04 02:05:17 crc kubenswrapper[4764]: I1204 02:05:17.283816 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:17 crc kubenswrapper[4764]: I1204 02:05:17.284419 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:17 crc kubenswrapper[4764]: I1204 02:05:17.330317 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:17 crc kubenswrapper[4764]: I1204 02:05:17.852437 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:17 crc kubenswrapper[4764]: I1204 02:05:17.916643 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-97f6q"] Dec 04 02:05:19 crc kubenswrapper[4764]: I1204 02:05:19.799521 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-97f6q" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="registry-server" containerID="cri-o://4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364" gracePeriod=2 Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.377510 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.550688 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqqlt\" (UniqueName: \"kubernetes.io/projected/685afb14-a588-44e2-bf5a-4fa12ba964e5-kube-api-access-jqqlt\") pod \"685afb14-a588-44e2-bf5a-4fa12ba964e5\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.551078 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-utilities\") pod \"685afb14-a588-44e2-bf5a-4fa12ba964e5\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.552604 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-catalog-content\") pod \"685afb14-a588-44e2-bf5a-4fa12ba964e5\" (UID: \"685afb14-a588-44e2-bf5a-4fa12ba964e5\") " Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.552650 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-utilities" (OuterVolumeSpecName: "utilities") pod "685afb14-a588-44e2-bf5a-4fa12ba964e5" (UID: 
"685afb14-a588-44e2-bf5a-4fa12ba964e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.554184 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.564345 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/685afb14-a588-44e2-bf5a-4fa12ba964e5-kube-api-access-jqqlt" (OuterVolumeSpecName: "kube-api-access-jqqlt") pod "685afb14-a588-44e2-bf5a-4fa12ba964e5" (UID: "685afb14-a588-44e2-bf5a-4fa12ba964e5"). InnerVolumeSpecName "kube-api-access-jqqlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.627612 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "685afb14-a588-44e2-bf5a-4fa12ba964e5" (UID: "685afb14-a588-44e2-bf5a-4fa12ba964e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.655076 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/685afb14-a588-44e2-bf5a-4fa12ba964e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.655116 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqqlt\" (UniqueName: \"kubernetes.io/projected/685afb14-a588-44e2-bf5a-4fa12ba964e5-kube-api-access-jqqlt\") on node \"crc\" DevicePath \"\"" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.816608 4764 generic.go:334] "Generic (PLEG): container finished" podID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerID="4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364" exitCode=0 Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.816692 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-97f6q" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.816752 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerDied","Data":"4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364"} Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.817145 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-97f6q" event={"ID":"685afb14-a588-44e2-bf5a-4fa12ba964e5","Type":"ContainerDied","Data":"1be315516c7abbee574ad9d759dc79daac616493fa29cc757bfadb2e7ed087b9"} Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.817207 4764 scope.go:117] "RemoveContainer" containerID="4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.848470 4764 scope.go:117] "RemoveContainer" 
containerID="fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.860123 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-97f6q"] Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.869411 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.869522 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.870147 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-97f6q"] Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.889086 4764 scope.go:117] "RemoveContainer" containerID="f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.944312 4764 scope.go:117] "RemoveContainer" containerID="4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364" Dec 04 02:05:20 crc kubenswrapper[4764]: E1204 02:05:20.944678 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364\": container with ID starting with 4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364 not found: ID does not exist" containerID="4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364" Dec 
04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.944746 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364"} err="failed to get container status \"4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364\": rpc error: code = NotFound desc = could not find container \"4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364\": container with ID starting with 4c4168b4222f55e304f830161e50a30ab6de522e4634925306cb07d22842a364 not found: ID does not exist" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.944776 4764 scope.go:117] "RemoveContainer" containerID="fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2" Dec 04 02:05:20 crc kubenswrapper[4764]: E1204 02:05:20.945069 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2\": container with ID starting with fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2 not found: ID does not exist" containerID="fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.945097 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2"} err="failed to get container status \"fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2\": rpc error: code = NotFound desc = could not find container \"fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2\": container with ID starting with fa8d563bec356c41bab2e1e78d13913a21335cb0337f27a883398c25e95b5dd2 not found: ID does not exist" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.945116 4764 scope.go:117] "RemoveContainer" 
containerID="f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78" Dec 04 02:05:20 crc kubenswrapper[4764]: E1204 02:05:20.945376 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78\": container with ID starting with f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78 not found: ID does not exist" containerID="f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78" Dec 04 02:05:20 crc kubenswrapper[4764]: I1204 02:05:20.945406 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78"} err="failed to get container status \"f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78\": rpc error: code = NotFound desc = could not find container \"f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78\": container with ID starting with f1fefb0639ff0fd19277045f76278d80bcd7fe4ed3b4c7f66d25674360bdaa78 not found: ID does not exist" Dec 04 02:05:22 crc kubenswrapper[4764]: I1204 02:05:22.558257 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" path="/var/lib/kubelet/pods/685afb14-a588-44e2-bf5a-4fa12ba964e5/volumes" Dec 04 02:05:50 crc kubenswrapper[4764]: I1204 02:05:50.869102 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:05:50 crc kubenswrapper[4764]: I1204 02:05:50.869935 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:05:50 crc kubenswrapper[4764]: I1204 02:05:50.870011 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 02:05:50 crc kubenswrapper[4764]: I1204 02:05:50.871245 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"68c979878c95f8640cce7cc5379dae1c33588f12c07a9605e16e2a51f711a2c5"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 02:05:50 crc kubenswrapper[4764]: I1204 02:05:50.871356 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://68c979878c95f8640cce7cc5379dae1c33588f12c07a9605e16e2a51f711a2c5" gracePeriod=600 Dec 04 02:05:51 crc kubenswrapper[4764]: I1204 02:05:51.265635 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="68c979878c95f8640cce7cc5379dae1c33588f12c07a9605e16e2a51f711a2c5" exitCode=0 Dec 04 02:05:51 crc kubenswrapper[4764]: I1204 02:05:51.265706 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"68c979878c95f8640cce7cc5379dae1c33588f12c07a9605e16e2a51f711a2c5"} Dec 04 02:05:51 crc kubenswrapper[4764]: I1204 02:05:51.266059 4764 scope.go:117] "RemoveContainer" containerID="5684754fd826178bf55b42f39d52cc73e4b36b766a35eaf1bb3a318c336232da" Dec 04 02:05:52 crc 
kubenswrapper[4764]: I1204 02:05:52.282163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b"} Dec 04 02:07:46 crc kubenswrapper[4764]: I1204 02:07:46.659105 4764 generic.go:334] "Generic (PLEG): container finished" podID="b8f75c01-301b-43f4-9d15-ad19080d1ba9" containerID="9af366115a7d78a9b94525085a65174a34e31428ec868f4f67448f0b07ef6d66" exitCode=0 Dec 04 02:07:46 crc kubenswrapper[4764]: I1204 02:07:46.659171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" event={"ID":"b8f75c01-301b-43f4-9d15-ad19080d1ba9","Type":"ContainerDied","Data":"9af366115a7d78a9b94525085a65174a34e31428ec868f4f67448f0b07ef6d66"} Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.199821 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.329939 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-combined-ca-bundle\") pod \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.330008 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ceph\") pod \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.330186 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-inventory\") pod \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.330208 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ssh-key\") pod \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.330231 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q7vb\" (UniqueName: \"kubernetes.io/projected/b8f75c01-301b-43f4-9d15-ad19080d1ba9-kube-api-access-2q7vb\") pod \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.330297 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-agent-neutron-config-0\") pod \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\" (UID: \"b8f75c01-301b-43f4-9d15-ad19080d1ba9\") " Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.336060 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8f75c01-301b-43f4-9d15-ad19080d1ba9-kube-api-access-2q7vb" (OuterVolumeSpecName: "kube-api-access-2q7vb") pod "b8f75c01-301b-43f4-9d15-ad19080d1ba9" (UID: "b8f75c01-301b-43f4-9d15-ad19080d1ba9"). InnerVolumeSpecName "kube-api-access-2q7vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.337066 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "b8f75c01-301b-43f4-9d15-ad19080d1ba9" (UID: "b8f75c01-301b-43f4-9d15-ad19080d1ba9"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.338115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ceph" (OuterVolumeSpecName: "ceph") pod "b8f75c01-301b-43f4-9d15-ad19080d1ba9" (UID: "b8f75c01-301b-43f4-9d15-ad19080d1ba9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.367226 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b8f75c01-301b-43f4-9d15-ad19080d1ba9" (UID: "b8f75c01-301b-43f4-9d15-ad19080d1ba9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.385598 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-inventory" (OuterVolumeSpecName: "inventory") pod "b8f75c01-301b-43f4-9d15-ad19080d1ba9" (UID: "b8f75c01-301b-43f4-9d15-ad19080d1ba9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.391587 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "b8f75c01-301b-43f4-9d15-ad19080d1ba9" (UID: "b8f75c01-301b-43f4-9d15-ad19080d1ba9"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.432053 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.432086 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.432098 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.432108 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.432121 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b8f75c01-301b-43f4-9d15-ad19080d1ba9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.432132 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q7vb\" (UniqueName: \"kubernetes.io/projected/b8f75c01-301b-43f4-9d15-ad19080d1ba9-kube-api-access-2q7vb\") on node \"crc\" DevicePath \"\"" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.685594 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" event={"ID":"b8f75c01-301b-43f4-9d15-ad19080d1ba9","Type":"ContainerDied","Data":"c52192dc9c2eabfde8ad0136eebd0de9eb5cf8b2884f636b89c41759da00a6a5"} Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.686414 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c52192dc9c2eabfde8ad0136eebd0de9eb5cf8b2884f636b89c41759da00a6a5" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.685653 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-crnht" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.826173 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw"] Dec 04 02:07:48 crc kubenswrapper[4764]: E1204 02:07:48.826763 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="extract-utilities" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.826784 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="extract-utilities" Dec 04 02:07:48 crc kubenswrapper[4764]: E1204 02:07:48.826800 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="extract-content" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.826810 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="extract-content" Dec 04 02:07:48 crc kubenswrapper[4764]: E1204 02:07:48.826833 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="registry-server" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.826842 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="registry-server" Dec 04 02:07:48 crc kubenswrapper[4764]: E1204 02:07:48.826886 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8f75c01-301b-43f4-9d15-ad19080d1ba9" containerName="neutron-sriov-openstack-openstack-cell1" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.826896 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8f75c01-301b-43f4-9d15-ad19080d1ba9" containerName="neutron-sriov-openstack-openstack-cell1" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.827137 4764 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="685afb14-a588-44e2-bf5a-4fa12ba964e5" containerName="registry-server" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.827164 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8f75c01-301b-43f4-9d15-ad19080d1ba9" containerName="neutron-sriov-openstack-openstack-cell1" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.828055 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.835219 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.835686 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.835819 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.836546 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.839638 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.886670 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw"] Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.942100 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.942248 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.942289 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.942316 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.942362 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94v97\" (UniqueName: \"kubernetes.io/projected/054cf6c8-5222-4ade-a2f3-53aebec044b4-kube-api-access-94v97\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:48 crc kubenswrapper[4764]: I1204 02:07:48.942504 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.044099 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.044149 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.044217 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94v97\" (UniqueName: \"kubernetes.io/projected/054cf6c8-5222-4ade-a2f3-53aebec044b4-kube-api-access-94v97\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.044259 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.044295 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.044399 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.048492 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.049128 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.049252 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" 
(UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.050021 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.055529 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.060125 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94v97\" (UniqueName: \"kubernetes.io/projected/054cf6c8-5222-4ade-a2f3-53aebec044b4-kube-api-access-94v97\") pod \"neutron-dhcp-openstack-openstack-cell1-nj9tw\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.199832 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:07:49 crc kubenswrapper[4764]: I1204 02:07:49.766227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw"] Dec 04 02:07:49 crc kubenswrapper[4764]: W1204 02:07:49.768429 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod054cf6c8_5222_4ade_a2f3_53aebec044b4.slice/crio-48948b7c8e84c4eed3eac81569c1c832e954f1fedee7e38fdd07eff1b6aa0c59 WatchSource:0}: Error finding container 48948b7c8e84c4eed3eac81569c1c832e954f1fedee7e38fdd07eff1b6aa0c59: Status 404 returned error can't find the container with id 48948b7c8e84c4eed3eac81569c1c832e954f1fedee7e38fdd07eff1b6aa0c59 Dec 04 02:07:50 crc kubenswrapper[4764]: I1204 02:07:50.714707 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" event={"ID":"054cf6c8-5222-4ade-a2f3-53aebec044b4","Type":"ContainerStarted","Data":"32e42436d1bd1c0e5c0a5eaf191d2f527cf42a0fddc249b49d827faed76681c0"} Dec 04 02:07:50 crc kubenswrapper[4764]: I1204 02:07:50.715291 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" event={"ID":"054cf6c8-5222-4ade-a2f3-53aebec044b4","Type":"ContainerStarted","Data":"48948b7c8e84c4eed3eac81569c1c832e954f1fedee7e38fdd07eff1b6aa0c59"} Dec 04 02:07:50 crc kubenswrapper[4764]: I1204 02:07:50.755603 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" podStartSLOduration=2.31442511 podStartE2EDuration="2.755577422s" podCreationTimestamp="2025-12-04 02:07:48 +0000 UTC" firstStartedPulling="2025-12-04 02:07:49.771998251 +0000 UTC m=+8805.533322682" lastFinishedPulling="2025-12-04 02:07:50.213150583 +0000 UTC m=+8805.974474994" observedRunningTime="2025-12-04 02:07:50.738618256 
+0000 UTC m=+8806.499942697" watchObservedRunningTime="2025-12-04 02:07:50.755577422 +0000 UTC m=+8806.516901843" Dec 04 02:08:20 crc kubenswrapper[4764]: I1204 02:08:20.868471 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:08:20 crc kubenswrapper[4764]: I1204 02:08:20.869154 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:08:50 crc kubenswrapper[4764]: I1204 02:08:50.869070 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:08:50 crc kubenswrapper[4764]: I1204 02:08:50.869654 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.460219 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ln6qq"] Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.463956 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.474023 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ln6qq"] Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.578948 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pcfg\" (UniqueName: \"kubernetes.io/projected/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-kube-api-access-8pcfg\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.579177 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-utilities\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.579406 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-catalog-content\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.681797 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-utilities\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.682004 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-catalog-content\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.682123 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pcfg\" (UniqueName: \"kubernetes.io/projected/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-kube-api-access-8pcfg\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.682484 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-catalog-content\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.683222 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-utilities\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.706808 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pcfg\" (UniqueName: \"kubernetes.io/projected/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-kube-api-access-8pcfg\") pod \"redhat-operators-ln6qq\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:19 crc kubenswrapper[4764]: I1204 02:09:19.789339 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:20 crc kubenswrapper[4764]: I1204 02:09:20.260064 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ln6qq"] Dec 04 02:09:20 crc kubenswrapper[4764]: W1204 02:09:20.267116 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9365ee_0a0b_4cb8_935a_4dbd49c3ab7a.slice/crio-7988a576ca1fbb8c9c569bbb4e2455955a3db8704335a1f02ad11ad0e24949c7 WatchSource:0}: Error finding container 7988a576ca1fbb8c9c569bbb4e2455955a3db8704335a1f02ad11ad0e24949c7: Status 404 returned error can't find the container with id 7988a576ca1fbb8c9c569bbb4e2455955a3db8704335a1f02ad11ad0e24949c7 Dec 04 02:09:20 crc kubenswrapper[4764]: I1204 02:09:20.869279 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:09:20 crc kubenswrapper[4764]: I1204 02:09:20.869857 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:09:20 crc kubenswrapper[4764]: I1204 02:09:20.869929 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 02:09:20 crc kubenswrapper[4764]: I1204 02:09:20.871179 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 02:09:20 crc kubenswrapper[4764]: I1204 02:09:20.871237 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" gracePeriod=600 Dec 04 02:09:20 crc kubenswrapper[4764]: E1204 02:09:20.999210 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.126185 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerID="4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146" exitCode=0 Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.126243 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerDied","Data":"4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146"} Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.126538 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" 
event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerStarted","Data":"7988a576ca1fbb8c9c569bbb4e2455955a3db8704335a1f02ad11ad0e24949c7"} Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.128195 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.133248 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" exitCode=0 Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.133292 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b"} Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.133332 4764 scope.go:117] "RemoveContainer" containerID="68c979878c95f8640cce7cc5379dae1c33588f12c07a9605e16e2a51f711a2c5" Dec 04 02:09:21 crc kubenswrapper[4764]: I1204 02:09:21.134292 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:09:21 crc kubenswrapper[4764]: E1204 02:09:21.134785 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:09:22 crc kubenswrapper[4764]: I1204 02:09:22.152886 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" 
event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerStarted","Data":"89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77"} Dec 04 02:09:24 crc kubenswrapper[4764]: I1204 02:09:24.181259 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerID="89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77" exitCode=0 Dec 04 02:09:24 crc kubenswrapper[4764]: I1204 02:09:24.181295 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerDied","Data":"89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77"} Dec 04 02:09:26 crc kubenswrapper[4764]: I1204 02:09:26.206947 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerStarted","Data":"9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e"} Dec 04 02:09:26 crc kubenswrapper[4764]: I1204 02:09:26.237712 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ln6qq" podStartSLOduration=2.965726341 podStartE2EDuration="7.237691913s" podCreationTimestamp="2025-12-04 02:09:19 +0000 UTC" firstStartedPulling="2025-12-04 02:09:21.127961252 +0000 UTC m=+8896.889285663" lastFinishedPulling="2025-12-04 02:09:25.399926794 +0000 UTC m=+8901.161251235" observedRunningTime="2025-12-04 02:09:26.232231039 +0000 UTC m=+8901.993555460" watchObservedRunningTime="2025-12-04 02:09:26.237691913 +0000 UTC m=+8901.999016334" Dec 04 02:09:29 crc kubenswrapper[4764]: I1204 02:09:29.789516 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:29 crc kubenswrapper[4764]: I1204 02:09:29.790758 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:30 crc kubenswrapper[4764]: I1204 02:09:30.871741 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ln6qq" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="registry-server" probeResult="failure" output=< Dec 04 02:09:30 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 02:09:30 crc kubenswrapper[4764]: > Dec 04 02:09:32 crc kubenswrapper[4764]: I1204 02:09:32.546804 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:09:32 crc kubenswrapper[4764]: E1204 02:09:32.547485 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:09:39 crc kubenswrapper[4764]: I1204 02:09:39.851971 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:39 crc kubenswrapper[4764]: I1204 02:09:39.924509 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:40 crc kubenswrapper[4764]: I1204 02:09:40.105879 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ln6qq"] Dec 04 02:09:41 crc kubenswrapper[4764]: I1204 02:09:41.389893 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ln6qq" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="registry-server" 
containerID="cri-o://9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e" gracePeriod=2 Dec 04 02:09:41 crc kubenswrapper[4764]: I1204 02:09:41.878573 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.041676 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-utilities\") pod \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.042477 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pcfg\" (UniqueName: \"kubernetes.io/projected/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-kube-api-access-8pcfg\") pod \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.042684 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-catalog-content\") pod \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\" (UID: \"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a\") " Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.043529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-utilities" (OuterVolumeSpecName: "utilities") pod "fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" (UID: "fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.054018 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-kube-api-access-8pcfg" (OuterVolumeSpecName: "kube-api-access-8pcfg") pod "fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" (UID: "fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a"). InnerVolumeSpecName "kube-api-access-8pcfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.145982 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.146016 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pcfg\" (UniqueName: \"kubernetes.io/projected/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-kube-api-access-8pcfg\") on node \"crc\" DevicePath \"\"" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.162404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" (UID: "fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.248440 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.408575 4764 generic.go:334] "Generic (PLEG): container finished" podID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerID="9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e" exitCode=0 Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.408637 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerDied","Data":"9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e"} Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.408663 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ln6qq" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.408688 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ln6qq" event={"ID":"fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a","Type":"ContainerDied","Data":"7988a576ca1fbb8c9c569bbb4e2455955a3db8704335a1f02ad11ad0e24949c7"} Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.408792 4764 scope.go:117] "RemoveContainer" containerID="9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.454903 4764 scope.go:117] "RemoveContainer" containerID="89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.462765 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ln6qq"] Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.478157 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ln6qq"] Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.497506 4764 scope.go:117] "RemoveContainer" containerID="4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.550248 4764 scope.go:117] "RemoveContainer" containerID="9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e" Dec 04 02:09:42 crc kubenswrapper[4764]: E1204 02:09:42.550657 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e\": container with ID starting with 9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e not found: ID does not exist" containerID="9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.550687 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e"} err="failed to get container status \"9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e\": rpc error: code = NotFound desc = could not find container \"9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e\": container with ID starting with 9845651782f14b868db2fcd2ae1da94fc30659b06bbf6731dc0e5ba5a946a01e not found: ID does not exist" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.550707 4764 scope.go:117] "RemoveContainer" containerID="89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77" Dec 04 02:09:42 crc kubenswrapper[4764]: E1204 02:09:42.551045 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77\": container with ID starting with 89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77 not found: ID does not exist" containerID="89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.551084 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77"} err="failed to get container status \"89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77\": rpc error: code = NotFound desc = could not find container \"89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77\": container with ID starting with 89703c43cd4a03fb38c45afd2e11e23e28ea9ad9506fa760d5864d8885daca77 not found: ID does not exist" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.551107 4764 scope.go:117] "RemoveContainer" containerID="4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146" Dec 04 02:09:42 crc kubenswrapper[4764]: E1204 
02:09:42.551337 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146\": container with ID starting with 4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146 not found: ID does not exist" containerID="4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.551362 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146"} err="failed to get container status \"4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146\": rpc error: code = NotFound desc = could not find container \"4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146\": container with ID starting with 4f3388aa437d0c7f2047df7edc13cd8d377f2e456d4e40012a0c241c0b28a146 not found: ID does not exist" Dec 04 02:09:42 crc kubenswrapper[4764]: I1204 02:09:42.557916 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" path="/var/lib/kubelet/pods/fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a/volumes" Dec 04 02:09:43 crc kubenswrapper[4764]: I1204 02:09:43.547246 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:09:43 crc kubenswrapper[4764]: E1204 02:09:43.548209 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:09:54 crc kubenswrapper[4764]: I1204 02:09:54.589848 
4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:09:54 crc kubenswrapper[4764]: E1204 02:09:54.593370 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:10:07 crc kubenswrapper[4764]: I1204 02:10:07.545454 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:10:07 crc kubenswrapper[4764]: E1204 02:10:07.546376 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:10:22 crc kubenswrapper[4764]: I1204 02:10:22.546492 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:10:22 crc kubenswrapper[4764]: E1204 02:10:22.548627 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:10:37 crc kubenswrapper[4764]: I1204 
02:10:37.547418 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:10:37 crc kubenswrapper[4764]: E1204 02:10:37.548951 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:10:49 crc kubenswrapper[4764]: I1204 02:10:49.546628 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:10:49 crc kubenswrapper[4764]: E1204 02:10:49.547653 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:11:03 crc kubenswrapper[4764]: I1204 02:11:03.545822 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:11:03 crc kubenswrapper[4764]: E1204 02:11:03.546885 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:11:14 crc 
kubenswrapper[4764]: I1204 02:11:14.559408 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:11:14 crc kubenswrapper[4764]: E1204 02:11:14.560365 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:11:27 crc kubenswrapper[4764]: I1204 02:11:27.551063 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:11:27 crc kubenswrapper[4764]: E1204 02:11:27.552207 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:11:40 crc kubenswrapper[4764]: I1204 02:11:40.545852 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:11:40 crc kubenswrapper[4764]: E1204 02:11:40.548258 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 
04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.676990 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l94jw"] Dec 04 02:11:49 crc kubenswrapper[4764]: E1204 02:11:49.678151 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="extract-utilities" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.678170 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="extract-utilities" Dec 04 02:11:49 crc kubenswrapper[4764]: E1204 02:11:49.678195 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="registry-server" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.678203 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="registry-server" Dec 04 02:11:49 crc kubenswrapper[4764]: E1204 02:11:49.678243 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="extract-content" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.678251 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="extract-content" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.678672 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc9365ee-0a0b-4cb8-935a-4dbd49c3ab7a" containerName="registry-server" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.680685 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.699421 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l94jw"] Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.712575 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrqg\" (UniqueName: \"kubernetes.io/projected/3884c21c-34e9-4554-b2d0-010105136d1d-kube-api-access-rcrqg\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.712798 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-catalog-content\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.713250 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-utilities\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.814777 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcrqg\" (UniqueName: \"kubernetes.io/projected/3884c21c-34e9-4554-b2d0-010105136d1d-kube-api-access-rcrqg\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.814841 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-catalog-content\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.814941 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-utilities\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.815577 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-utilities\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.815825 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-catalog-content\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:49 crc kubenswrapper[4764]: I1204 02:11:49.844898 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcrqg\" (UniqueName: \"kubernetes.io/projected/3884c21c-34e9-4554-b2d0-010105136d1d-kube-api-access-rcrqg\") pod \"community-operators-l94jw\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:50 crc kubenswrapper[4764]: I1204 02:11:50.040952 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:11:50 crc kubenswrapper[4764]: I1204 02:11:50.588059 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l94jw"] Dec 04 02:11:51 crc kubenswrapper[4764]: I1204 02:11:51.008312 4764 generic.go:334] "Generic (PLEG): container finished" podID="3884c21c-34e9-4554-b2d0-010105136d1d" containerID="b7fa36b1e9862281e28f0713a61c81bbbb5249358a354994d6037b374f0b9887" exitCode=0 Dec 04 02:11:51 crc kubenswrapper[4764]: I1204 02:11:51.008537 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerDied","Data":"b7fa36b1e9862281e28f0713a61c81bbbb5249358a354994d6037b374f0b9887"} Dec 04 02:11:51 crc kubenswrapper[4764]: I1204 02:11:51.008750 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerStarted","Data":"d83c3fc5bda4089e3e9d31e828b80eecea4394aca387cbef86822b6deb7fa1a7"} Dec 04 02:11:52 crc kubenswrapper[4764]: I1204 02:11:52.022827 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerStarted","Data":"8788c7f3520838bea379ba9ad7b7a649d0e850950f76e91778b93ff2b56151a3"} Dec 04 02:11:53 crc kubenswrapper[4764]: I1204 02:11:53.033917 4764 generic.go:334] "Generic (PLEG): container finished" podID="3884c21c-34e9-4554-b2d0-010105136d1d" containerID="8788c7f3520838bea379ba9ad7b7a649d0e850950f76e91778b93ff2b56151a3" exitCode=0 Dec 04 02:11:53 crc kubenswrapper[4764]: I1204 02:11:53.034004 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" 
event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerDied","Data":"8788c7f3520838bea379ba9ad7b7a649d0e850950f76e91778b93ff2b56151a3"} Dec 04 02:11:53 crc kubenswrapper[4764]: I1204 02:11:53.547159 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:11:53 crc kubenswrapper[4764]: E1204 02:11:53.547485 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:11:54 crc kubenswrapper[4764]: I1204 02:11:54.051218 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerStarted","Data":"8630fda94f8b11d9ab5027e09a18c39e24627ef50f82c7c5e77edec64faa6826"} Dec 04 02:11:54 crc kubenswrapper[4764]: I1204 02:11:54.080056 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l94jw" podStartSLOduration=2.684544596 podStartE2EDuration="5.080037443s" podCreationTimestamp="2025-12-04 02:11:49 +0000 UTC" firstStartedPulling="2025-12-04 02:11:51.011182762 +0000 UTC m=+9046.772507183" lastFinishedPulling="2025-12-04 02:11:53.406675619 +0000 UTC m=+9049.168000030" observedRunningTime="2025-12-04 02:11:54.074153418 +0000 UTC m=+9049.835477849" watchObservedRunningTime="2025-12-04 02:11:54.080037443 +0000 UTC m=+9049.841361854" Dec 04 02:12:00 crc kubenswrapper[4764]: I1204 02:12:00.041663 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:12:00 crc 
kubenswrapper[4764]: I1204 02:12:00.042150 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:12:00 crc kubenswrapper[4764]: I1204 02:12:00.091281 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:12:00 crc kubenswrapper[4764]: I1204 02:12:00.169466 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:12:00 crc kubenswrapper[4764]: I1204 02:12:00.344315 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l94jw"] Dec 04 02:12:02 crc kubenswrapper[4764]: I1204 02:12:02.130174 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l94jw" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="registry-server" containerID="cri-o://8630fda94f8b11d9ab5027e09a18c39e24627ef50f82c7c5e77edec64faa6826" gracePeriod=2 Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.144175 4764 generic.go:334] "Generic (PLEG): container finished" podID="3884c21c-34e9-4554-b2d0-010105136d1d" containerID="8630fda94f8b11d9ab5027e09a18c39e24627ef50f82c7c5e77edec64faa6826" exitCode=0 Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.144246 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerDied","Data":"8630fda94f8b11d9ab5027e09a18c39e24627ef50f82c7c5e77edec64faa6826"} Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.607744 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.794166 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-utilities\") pod \"3884c21c-34e9-4554-b2d0-010105136d1d\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.794539 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcrqg\" (UniqueName: \"kubernetes.io/projected/3884c21c-34e9-4554-b2d0-010105136d1d-kube-api-access-rcrqg\") pod \"3884c21c-34e9-4554-b2d0-010105136d1d\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.794599 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-catalog-content\") pod \"3884c21c-34e9-4554-b2d0-010105136d1d\" (UID: \"3884c21c-34e9-4554-b2d0-010105136d1d\") " Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.795089 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-utilities" (OuterVolumeSpecName: "utilities") pod "3884c21c-34e9-4554-b2d0-010105136d1d" (UID: "3884c21c-34e9-4554-b2d0-010105136d1d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.795576 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.800021 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3884c21c-34e9-4554-b2d0-010105136d1d-kube-api-access-rcrqg" (OuterVolumeSpecName: "kube-api-access-rcrqg") pod "3884c21c-34e9-4554-b2d0-010105136d1d" (UID: "3884c21c-34e9-4554-b2d0-010105136d1d"). InnerVolumeSpecName "kube-api-access-rcrqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.842399 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3884c21c-34e9-4554-b2d0-010105136d1d" (UID: "3884c21c-34e9-4554-b2d0-010105136d1d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.897536 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcrqg\" (UniqueName: \"kubernetes.io/projected/3884c21c-34e9-4554-b2d0-010105136d1d-kube-api-access-rcrqg\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:03 crc kubenswrapper[4764]: I1204 02:12:03.897577 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3884c21c-34e9-4554-b2d0-010105136d1d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.160092 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l94jw" event={"ID":"3884c21c-34e9-4554-b2d0-010105136d1d","Type":"ContainerDied","Data":"d83c3fc5bda4089e3e9d31e828b80eecea4394aca387cbef86822b6deb7fa1a7"} Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.160164 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l94jw" Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.160172 4764 scope.go:117] "RemoveContainer" containerID="8630fda94f8b11d9ab5027e09a18c39e24627ef50f82c7c5e77edec64faa6826" Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.201758 4764 scope.go:117] "RemoveContainer" containerID="8788c7f3520838bea379ba9ad7b7a649d0e850950f76e91778b93ff2b56151a3" Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.231835 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l94jw"] Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.244621 4764 scope.go:117] "RemoveContainer" containerID="b7fa36b1e9862281e28f0713a61c81bbbb5249358a354994d6037b374f0b9887" Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.246600 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l94jw"] Dec 04 02:12:04 crc kubenswrapper[4764]: I1204 02:12:04.567801 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" path="/var/lib/kubelet/pods/3884c21c-34e9-4554-b2d0-010105136d1d/volumes" Dec 04 02:12:06 crc kubenswrapper[4764]: I1204 02:12:06.546657 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:12:06 crc kubenswrapper[4764]: E1204 02:12:06.547704 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:12:20 crc kubenswrapper[4764]: I1204 02:12:20.545868 4764 scope.go:117] "RemoveContainer" 
containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:12:20 crc kubenswrapper[4764]: E1204 02:12:20.546826 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:12:35 crc kubenswrapper[4764]: I1204 02:12:35.546820 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:12:35 crc kubenswrapper[4764]: E1204 02:12:35.548958 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:12:42 crc kubenswrapper[4764]: I1204 02:12:42.639824 4764 generic.go:334] "Generic (PLEG): container finished" podID="054cf6c8-5222-4ade-a2f3-53aebec044b4" containerID="32e42436d1bd1c0e5c0a5eaf191d2f527cf42a0fddc249b49d827faed76681c0" exitCode=0 Dec 04 02:12:42 crc kubenswrapper[4764]: I1204 02:12:42.639867 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" event={"ID":"054cf6c8-5222-4ade-a2f3-53aebec044b4","Type":"ContainerDied","Data":"32e42436d1bd1c0e5c0a5eaf191d2f527cf42a0fddc249b49d827faed76681c0"} Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.228779 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.358168 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-combined-ca-bundle\") pod \"054cf6c8-5222-4ade-a2f3-53aebec044b4\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.358250 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-agent-neutron-config-0\") pod \"054cf6c8-5222-4ade-a2f3-53aebec044b4\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.358350 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-inventory\") pod \"054cf6c8-5222-4ade-a2f3-53aebec044b4\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.358385 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ceph\") pod \"054cf6c8-5222-4ade-a2f3-53aebec044b4\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.358471 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ssh-key\") pod \"054cf6c8-5222-4ade-a2f3-53aebec044b4\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.358501 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-94v97\" (UniqueName: \"kubernetes.io/projected/054cf6c8-5222-4ade-a2f3-53aebec044b4-kube-api-access-94v97\") pod \"054cf6c8-5222-4ade-a2f3-53aebec044b4\" (UID: \"054cf6c8-5222-4ade-a2f3-53aebec044b4\") " Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.364562 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ceph" (OuterVolumeSpecName: "ceph") pod "054cf6c8-5222-4ade-a2f3-53aebec044b4" (UID: "054cf6c8-5222-4ade-a2f3-53aebec044b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.364846 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054cf6c8-5222-4ade-a2f3-53aebec044b4-kube-api-access-94v97" (OuterVolumeSpecName: "kube-api-access-94v97") pod "054cf6c8-5222-4ade-a2f3-53aebec044b4" (UID: "054cf6c8-5222-4ade-a2f3-53aebec044b4"). InnerVolumeSpecName "kube-api-access-94v97". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.365459 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "054cf6c8-5222-4ade-a2f3-53aebec044b4" (UID: "054cf6c8-5222-4ade-a2f3-53aebec044b4"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.392046 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "054cf6c8-5222-4ade-a2f3-53aebec044b4" (UID: "054cf6c8-5222-4ade-a2f3-53aebec044b4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.396388 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "054cf6c8-5222-4ade-a2f3-53aebec044b4" (UID: "054cf6c8-5222-4ade-a2f3-53aebec044b4"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.418590 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-inventory" (OuterVolumeSpecName: "inventory") pod "054cf6c8-5222-4ade-a2f3-53aebec044b4" (UID: "054cf6c8-5222-4ade-a2f3-53aebec044b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.461078 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94v97\" (UniqueName: \"kubernetes.io/projected/054cf6c8-5222-4ade-a2f3-53aebec044b4-kube-api-access-94v97\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.461133 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.461155 4764 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.461175 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.461193 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ceph\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.461208 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/054cf6c8-5222-4ade-a2f3-53aebec044b4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.663348 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" event={"ID":"054cf6c8-5222-4ade-a2f3-53aebec044b4","Type":"ContainerDied","Data":"48948b7c8e84c4eed3eac81569c1c832e954f1fedee7e38fdd07eff1b6aa0c59"} Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.663393 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48948b7c8e84c4eed3eac81569c1c832e954f1fedee7e38fdd07eff1b6aa0c59" Dec 04 02:12:44 crc kubenswrapper[4764]: I1204 02:12:44.663610 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nj9tw" Dec 04 02:12:47 crc kubenswrapper[4764]: I1204 02:12:47.545301 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:12:47 crc kubenswrapper[4764]: E1204 02:12:47.546141 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:12:55 crc kubenswrapper[4764]: I1204 02:12:55.425899 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 02:12:55 crc kubenswrapper[4764]: I1204 02:12:55.426963 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7df9bb94-13e8-4250-8538-f55f68e1d29a" containerName="nova-cell0-conductor-conductor" containerID="cri-o://aa1748e5f32efac2776cd6a627bca59715d373132528426aee5f82631a52ed7e" gracePeriod=30 Dec 04 02:12:55 crc kubenswrapper[4764]: I1204 02:12:55.970779 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 02:12:55 crc kubenswrapper[4764]: I1204 02:12:55.977944 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="85b4c080-751d-41bd-8c06-3982fe210fc5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" gracePeriod=30 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.092573 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 02:12:56 crc 
kubenswrapper[4764]: I1204 02:12:56.093081 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-log" containerID="cri-o://1363f1b0274e1dc39fa5a3395613029631387b7821bc5c5ae39e647dd1436ff4" gracePeriod=30 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.093151 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-api" containerID="cri-o://6c943e4231f493c8dc35fdfb95b9753bb224e0fc35cc27d4ee80ba8bf459da59" gracePeriod=30 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.105242 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.105455 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="060031f0-ca89-4cb4-82db-2b095eb44cf0" containerName="nova-scheduler-scheduler" containerID="cri-o://ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" gracePeriod=30 Dec 04 02:12:56 crc kubenswrapper[4764]: E1204 02:12:56.142185 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 02:12:56 crc kubenswrapper[4764]: E1204 02:12:56.143699 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 02:12:56 crc 
kubenswrapper[4764]: E1204 02:12:56.145184 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 04 02:12:56 crc kubenswrapper[4764]: E1204 02:12:56.145225 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="85b4c080-751d-41bd-8c06-3982fe210fc5" containerName="nova-cell1-conductor-conductor" Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.149579 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.149824 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-log" containerID="cri-o://ca95ba055a32a0d428208d646911fb0196920bf9994bf72f74317d6dbe7568e6" gracePeriod=30 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.150278 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-metadata" containerID="cri-o://151e508b584402b282c414977cdbc9ef76da2ee3166f08c5ee1d0437cda46176" gracePeriod=30 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.805004 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerID="1363f1b0274e1dc39fa5a3395613029631387b7821bc5c5ae39e647dd1436ff4" exitCode=143 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.805077 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"ab096a5c-73c5-409e-a765-00ccd00123d4","Type":"ContainerDied","Data":"1363f1b0274e1dc39fa5a3395613029631387b7821bc5c5ae39e647dd1436ff4"} Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.809366 4764 generic.go:334] "Generic (PLEG): container finished" podID="7df9bb94-13e8-4250-8538-f55f68e1d29a" containerID="aa1748e5f32efac2776cd6a627bca59715d373132528426aee5f82631a52ed7e" exitCode=0 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.809464 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7df9bb94-13e8-4250-8538-f55f68e1d29a","Type":"ContainerDied","Data":"aa1748e5f32efac2776cd6a627bca59715d373132528426aee5f82631a52ed7e"} Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.811164 4764 generic.go:334] "Generic (PLEG): container finished" podID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerID="ca95ba055a32a0d428208d646911fb0196920bf9994bf72f74317d6dbe7568e6" exitCode=143 Dec 04 02:12:56 crc kubenswrapper[4764]: I1204 02:12:56.811192 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a62a7a6-2d10-4577-9c35-dc3ecc032237","Type":"ContainerDied","Data":"ca95ba055a32a0d428208d646911fb0196920bf9994bf72f74317d6dbe7568e6"} Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.179011 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.242355 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.243685 4764 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.247901 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b4bm\" (UniqueName: \"kubernetes.io/projected/7df9bb94-13e8-4250-8538-f55f68e1d29a-kube-api-access-8b4bm\") pod \"7df9bb94-13e8-4250-8538-f55f68e1d29a\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.248146 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-combined-ca-bundle\") pod \"7df9bb94-13e8-4250-8538-f55f68e1d29a\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.248177 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-config-data\") pod \"7df9bb94-13e8-4250-8538-f55f68e1d29a\" (UID: \"7df9bb94-13e8-4250-8538-f55f68e1d29a\") " Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.248967 4764 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.249030 4764 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="060031f0-ca89-4cb4-82db-2b095eb44cf0" containerName="nova-scheduler-scheduler" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.255981 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df9bb94-13e8-4250-8538-f55f68e1d29a-kube-api-access-8b4bm" (OuterVolumeSpecName: "kube-api-access-8b4bm") pod "7df9bb94-13e8-4250-8538-f55f68e1d29a" (UID: "7df9bb94-13e8-4250-8538-f55f68e1d29a"). InnerVolumeSpecName "kube-api-access-8b4bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.283412 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-config-data" (OuterVolumeSpecName: "config-data") pod "7df9bb94-13e8-4250-8538-f55f68e1d29a" (UID: "7df9bb94-13e8-4250-8538-f55f68e1d29a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.283929 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7df9bb94-13e8-4250-8538-f55f68e1d29a" (UID: "7df9bb94-13e8-4250-8538-f55f68e1d29a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.350778 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.350810 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7df9bb94-13e8-4250-8538-f55f68e1d29a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.350821 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b4bm\" (UniqueName: \"kubernetes.io/projected/7df9bb94-13e8-4250-8538-f55f68e1d29a-kube-api-access-8b4bm\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.744940 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.826994 4764 generic.go:334] "Generic (PLEG): container finished" podID="85b4c080-751d-41bd-8c06-3982fe210fc5" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" exitCode=0 Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.827052 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85b4c080-751d-41bd-8c06-3982fe210fc5","Type":"ContainerDied","Data":"d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349"} Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.827078 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85b4c080-751d-41bd-8c06-3982fe210fc5","Type":"ContainerDied","Data":"545e220b98e947dc1a29d784dfe57a7b04f5680f4e3772195816af9761ecc2dd"} Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.827094 4764 
scope.go:117] "RemoveContainer" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.827204 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.831493 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7df9bb94-13e8-4250-8538-f55f68e1d29a","Type":"ContainerDied","Data":"a7ca49fde1d08ea109aef0d1a932d4a85e77a8c83440118df0ce0c7944d2d09c"} Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.831598 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.866472 4764 scope.go:117] "RemoveContainer" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.866917 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349\": container with ID starting with d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349 not found: ID does not exist" containerID="d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.866949 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349"} err="failed to get container status \"d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349\": rpc error: code = NotFound desc = could not find container \"d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349\": container with ID starting with d7252aadcc204a1fdbf8c4e20df7cc15e6fc6636fdaa592f36ac34859a7e6349 not 
found: ID does not exist" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.866972 4764 scope.go:117] "RemoveContainer" containerID="aa1748e5f32efac2776cd6a627bca59715d373132528426aee5f82631a52ed7e" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.867644 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgxjg\" (UniqueName: \"kubernetes.io/projected/85b4c080-751d-41bd-8c06-3982fe210fc5-kube-api-access-pgxjg\") pod \"85b4c080-751d-41bd-8c06-3982fe210fc5\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.867792 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.867880 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-combined-ca-bundle\") pod \"85b4c080-751d-41bd-8c06-3982fe210fc5\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.867924 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-config-data\") pod \"85b4c080-751d-41bd-8c06-3982fe210fc5\" (UID: \"85b4c080-751d-41bd-8c06-3982fe210fc5\") " Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.883679 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b4c080-751d-41bd-8c06-3982fe210fc5-kube-api-access-pgxjg" (OuterVolumeSpecName: "kube-api-access-pgxjg") pod "85b4c080-751d-41bd-8c06-3982fe210fc5" (UID: "85b4c080-751d-41bd-8c06-3982fe210fc5"). InnerVolumeSpecName "kube-api-access-pgxjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.892702 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910257 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.910687 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df9bb94-13e8-4250-8538-f55f68e1d29a" containerName="nova-cell0-conductor-conductor" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910705 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df9bb94-13e8-4250-8538-f55f68e1d29a" containerName="nova-cell0-conductor-conductor" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.910861 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054cf6c8-5222-4ade-a2f3-53aebec044b4" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910874 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="054cf6c8-5222-4ade-a2f3-53aebec044b4" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.910891 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b4c080-751d-41bd-8c06-3982fe210fc5" containerName="nova-cell1-conductor-conductor" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910899 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b4c080-751d-41bd-8c06-3982fe210fc5" containerName="nova-cell1-conductor-conductor" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.910917 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="extract-utilities" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910923 4764 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="extract-utilities" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.910935 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="extract-content" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910943 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="extract-content" Dec 04 02:12:57 crc kubenswrapper[4764]: E1204 02:12:57.910951 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="registry-server" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.910957 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="registry-server" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.911148 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df9bb94-13e8-4250-8538-f55f68e1d29a" containerName="nova-cell0-conductor-conductor" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.911170 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="054cf6c8-5222-4ade-a2f3-53aebec044b4" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.911185 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="3884c21c-34e9-4554-b2d0-010105136d1d" containerName="registry-server" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.911193 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b4c080-751d-41bd-8c06-3982fe210fc5" containerName="nova-cell1-conductor-conductor" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.911962 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.914270 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.927224 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.929033 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-config-data" (OuterVolumeSpecName: "config-data") pod "85b4c080-751d-41bd-8c06-3982fe210fc5" (UID: "85b4c080-751d-41bd-8c06-3982fe210fc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.942930 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85b4c080-751d-41bd-8c06-3982fe210fc5" (UID: "85b4c080-751d-41bd-8c06-3982fe210fc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.970875 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a6efa2-f727-4605-8a99-4b8fc183e2cb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.970935 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfc6f\" (UniqueName: \"kubernetes.io/projected/35a6efa2-f727-4605-8a99-4b8fc183e2cb-kube-api-access-tfc6f\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.971126 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a6efa2-f727-4605-8a99-4b8fc183e2cb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.971192 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.971204 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85b4c080-751d-41bd-8c06-3982fe210fc5-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:57 crc kubenswrapper[4764]: I1204 02:12:57.971212 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgxjg\" (UniqueName: 
\"kubernetes.io/projected/85b4c080-751d-41bd-8c06-3982fe210fc5-kube-api-access-pgxjg\") on node \"crc\" DevicePath \"\"" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.073828 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a6efa2-f727-4605-8a99-4b8fc183e2cb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.074161 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfc6f\" (UniqueName: \"kubernetes.io/projected/35a6efa2-f727-4605-8a99-4b8fc183e2cb-kube-api-access-tfc6f\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.074407 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a6efa2-f727-4605-8a99-4b8fc183e2cb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.080327 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a6efa2-f727-4605-8a99-4b8fc183e2cb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.080927 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a6efa2-f727-4605-8a99-4b8fc183e2cb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 
04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.092516 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfc6f\" (UniqueName: \"kubernetes.io/projected/35a6efa2-f727-4605-8a99-4b8fc183e2cb-kube-api-access-tfc6f\") pod \"nova-cell0-conductor-0\" (UID: \"35a6efa2-f727-4605-8a99-4b8fc183e2cb\") " pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.183151 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.199121 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.212342 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.213881 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.216693 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.222762 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.278328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524a45cb-f83d-4c0b-b05e-76399a5224eb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.278405 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrl6s\" (UniqueName: 
\"kubernetes.io/projected/524a45cb-f83d-4c0b-b05e-76399a5224eb-kube-api-access-qrl6s\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.278435 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524a45cb-f83d-4c0b-b05e-76399a5224eb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.279779 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.380834 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524a45cb-f83d-4c0b-b05e-76399a5224eb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.380947 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrl6s\" (UniqueName: \"kubernetes.io/projected/524a45cb-f83d-4c0b-b05e-76399a5224eb-kube-api-access-qrl6s\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.380994 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524a45cb-f83d-4c0b-b05e-76399a5224eb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.388354 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/524a45cb-f83d-4c0b-b05e-76399a5224eb-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.390698 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/524a45cb-f83d-4c0b-b05e-76399a5224eb-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.421445 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrl6s\" (UniqueName: \"kubernetes.io/projected/524a45cb-f83d-4c0b-b05e-76399a5224eb-kube-api-access-qrl6s\") pod \"nova-cell1-conductor-0\" (UID: \"524a45cb-f83d-4c0b-b05e-76399a5224eb\") " pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.559230 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.562804 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df9bb94-13e8-4250-8538-f55f68e1d29a" path="/var/lib/kubelet/pods/7df9bb94-13e8-4250-8538-f55f68e1d29a/volumes" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.563760 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b4c080-751d-41bd-8c06-3982fe210fc5" path="/var/lib/kubelet/pods/85b4c080-751d-41bd-8c06-3982fe210fc5/volumes" Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.749669 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 02:12:58 crc kubenswrapper[4764]: W1204 02:12:58.757323 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35a6efa2_f727_4605_8a99_4b8fc183e2cb.slice/crio-6875fe07c7d52682f84a0a8a83f3d8bf1837a31068ff9edd2aecc5afcb46506c WatchSource:0}: Error finding container 6875fe07c7d52682f84a0a8a83f3d8bf1837a31068ff9edd2aecc5afcb46506c: Status 404 returned error can't find the container with id 6875fe07c7d52682f84a0a8a83f3d8bf1837a31068ff9edd2aecc5afcb46506c Dec 04 02:12:58 crc kubenswrapper[4764]: I1204 02:12:58.848323 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"35a6efa2-f727-4605-8a99-4b8fc183e2cb","Type":"ContainerStarted","Data":"6875fe07c7d52682f84a0a8a83f3d8bf1837a31068ff9edd2aecc5afcb46506c"} Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.000694 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 02:12:59 crc kubenswrapper[4764]: W1204 02:12:59.012664 4764 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod524a45cb_f83d_4c0b_b05e_76399a5224eb.slice/crio-9c648e51f1c3b984167843cdf14a996ad00543f0857d7e99d9f7416db4d4690b WatchSource:0}: Error finding container 9c648e51f1c3b984167843cdf14a996ad00543f0857d7e99d9f7416db4d4690b: Status 404 returned error can't find the container with id 9c648e51f1c3b984167843cdf14a996ad00543f0857d7e99d9f7416db4d4690b Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.716357 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:58422->10.217.1.83:8775: read: connection reset by peer" Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.716402 4764 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.83:8775/\": read tcp 10.217.0.2:58426->10.217.1.83:8775: read: connection reset by peer" Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.875769 4764 generic.go:334] "Generic (PLEG): container finished" podID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerID="6c943e4231f493c8dc35fdfb95b9753bb224e0fc35cc27d4ee80ba8bf459da59" exitCode=0 Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.875844 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab096a5c-73c5-409e-a765-00ccd00123d4","Type":"ContainerDied","Data":"6c943e4231f493c8dc35fdfb95b9753bb224e0fc35cc27d4ee80ba8bf459da59"} Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.879767 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"524a45cb-f83d-4c0b-b05e-76399a5224eb","Type":"ContainerStarted","Data":"c88fdc9a799f4ecfef96dce7a91facacfe309667f860282c5b2346d619aa445c"} Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.879803 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"524a45cb-f83d-4c0b-b05e-76399a5224eb","Type":"ContainerStarted","Data":"9c648e51f1c3b984167843cdf14a996ad00543f0857d7e99d9f7416db4d4690b"} Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.881048 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.883958 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"35a6efa2-f727-4605-8a99-4b8fc183e2cb","Type":"ContainerStarted","Data":"d9e8b30311876c1bb25a4f3c8f4b0fad207a3528e40a162852578097fe777506"} Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.884845 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.887043 4764 generic.go:334] "Generic (PLEG): container finished" podID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerID="151e508b584402b282c414977cdbc9ef76da2ee3166f08c5ee1d0437cda46176" exitCode=0 Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.887074 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a62a7a6-2d10-4577-9c35-dc3ecc032237","Type":"ContainerDied","Data":"151e508b584402b282c414977cdbc9ef76da2ee3166f08c5ee1d0437cda46176"} Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.899780 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.899759518 podStartE2EDuration="1.899759518s" podCreationTimestamp="2025-12-04 02:12:58 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 02:12:59.895125035 +0000 UTC m=+9115.656449446" watchObservedRunningTime="2025-12-04 02:12:59.899759518 +0000 UTC m=+9115.661083929" Dec 04 02:12:59 crc kubenswrapper[4764]: I1204 02:12:59.923135 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.923116502 podStartE2EDuration="2.923116502s" podCreationTimestamp="2025-12-04 02:12:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 02:12:59.911157248 +0000 UTC m=+9115.672481659" watchObservedRunningTime="2025-12-04 02:12:59.923116502 +0000 UTC m=+9115.684440913" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.034962 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.124154 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-config-data\") pod \"ab096a5c-73c5-409e-a765-00ccd00123d4\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.124545 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mllnf\" (UniqueName: \"kubernetes.io/projected/ab096a5c-73c5-409e-a765-00ccd00123d4-kube-api-access-mllnf\") pod \"ab096a5c-73c5-409e-a765-00ccd00123d4\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.124580 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab096a5c-73c5-409e-a765-00ccd00123d4-logs\") pod \"ab096a5c-73c5-409e-a765-00ccd00123d4\" (UID: 
\"ab096a5c-73c5-409e-a765-00ccd00123d4\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.124605 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-combined-ca-bundle\") pod \"ab096a5c-73c5-409e-a765-00ccd00123d4\" (UID: \"ab096a5c-73c5-409e-a765-00ccd00123d4\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.128620 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab096a5c-73c5-409e-a765-00ccd00123d4-logs" (OuterVolumeSpecName: "logs") pod "ab096a5c-73c5-409e-a765-00ccd00123d4" (UID: "ab096a5c-73c5-409e-a765-00ccd00123d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.135633 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab096a5c-73c5-409e-a765-00ccd00123d4-kube-api-access-mllnf" (OuterVolumeSpecName: "kube-api-access-mllnf") pod "ab096a5c-73c5-409e-a765-00ccd00123d4" (UID: "ab096a5c-73c5-409e-a765-00ccd00123d4"). InnerVolumeSpecName "kube-api-access-mllnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.153709 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.193766 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab096a5c-73c5-409e-a765-00ccd00123d4" (UID: "ab096a5c-73c5-409e-a765-00ccd00123d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.202576 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-config-data" (OuterVolumeSpecName: "config-data") pod "ab096a5c-73c5-409e-a765-00ccd00123d4" (UID: "ab096a5c-73c5-409e-a765-00ccd00123d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.226803 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dt2d\" (UniqueName: \"kubernetes.io/projected/9a62a7a6-2d10-4577-9c35-dc3ecc032237-kube-api-access-6dt2d\") pod \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.226879 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-config-data\") pod \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227029 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-combined-ca-bundle\") pod \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227074 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a62a7a6-2d10-4577-9c35-dc3ecc032237-logs\") pod \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\" (UID: \"9a62a7a6-2d10-4577-9c35-dc3ecc032237\") " Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227572 4764 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-mllnf\" (UniqueName: \"kubernetes.io/projected/ab096a5c-73c5-409e-a765-00ccd00123d4-kube-api-access-mllnf\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227588 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab096a5c-73c5-409e-a765-00ccd00123d4-logs\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227599 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227607 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab096a5c-73c5-409e-a765-00ccd00123d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.227692 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a62a7a6-2d10-4577-9c35-dc3ecc032237-logs" (OuterVolumeSpecName: "logs") pod "9a62a7a6-2d10-4577-9c35-dc3ecc032237" (UID: "9a62a7a6-2d10-4577-9c35-dc3ecc032237"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.230557 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a62a7a6-2d10-4577-9c35-dc3ecc032237-kube-api-access-6dt2d" (OuterVolumeSpecName: "kube-api-access-6dt2d") pod "9a62a7a6-2d10-4577-9c35-dc3ecc032237" (UID: "9a62a7a6-2d10-4577-9c35-dc3ecc032237"). InnerVolumeSpecName "kube-api-access-6dt2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.265603 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-config-data" (OuterVolumeSpecName: "config-data") pod "9a62a7a6-2d10-4577-9c35-dc3ecc032237" (UID: "9a62a7a6-2d10-4577-9c35-dc3ecc032237"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.270213 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a62a7a6-2d10-4577-9c35-dc3ecc032237" (UID: "9a62a7a6-2d10-4577-9c35-dc3ecc032237"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.329899 4764 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a62a7a6-2d10-4577-9c35-dc3ecc032237-logs\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.329931 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dt2d\" (UniqueName: \"kubernetes.io/projected/9a62a7a6-2d10-4577-9c35-dc3ecc032237-kube-api-access-6dt2d\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.329944 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.329954 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62a7a6-2d10-4577-9c35-dc3ecc032237-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 
02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.545947 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:13:00 crc kubenswrapper[4764]: E1204 02:13:00.546263 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.897631 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ab096a5c-73c5-409e-a765-00ccd00123d4","Type":"ContainerDied","Data":"a53a7cc2b7bf2d7ebc8df087ea3b8dab5f4725270c579e624d48ec0efe96afce"} Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.898618 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.898908 4764 scope.go:117] "RemoveContainer" containerID="6c943e4231f493c8dc35fdfb95b9753bb224e0fc35cc27d4ee80ba8bf459da59" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.901217 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 02:13:00 crc kubenswrapper[4764]: I1204 02:13:00.901872 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9a62a7a6-2d10-4577-9c35-dc3ecc032237","Type":"ContainerDied","Data":"1cb1573f63a32bc60fc2d422c8ab33526ccb30c4a9b13840d6665a0169d7da17"} Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.048588 4764 scope.go:117] "RemoveContainer" containerID="1363f1b0274e1dc39fa5a3395613029631387b7821bc5c5ae39e647dd1436ff4" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.082692 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.103380 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.114663 4764 scope.go:117] "RemoveContainer" containerID="151e508b584402b282c414977cdbc9ef76da2ee3166f08c5ee1d0437cda46176" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.119410 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.134010 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.150170 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: E1204 02:13:01.150691 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-api" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.150703 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-api" Dec 04 02:13:01 crc kubenswrapper[4764]: E1204 02:13:01.150713 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-metadata" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.150732 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-metadata" Dec 04 02:13:01 crc kubenswrapper[4764]: E1204 02:13:01.150766 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-log" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.150774 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-log" Dec 04 02:13:01 crc kubenswrapper[4764]: E1204 02:13:01.150793 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-log" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.150800 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-log" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.150993 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-log" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.151004 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" containerName="nova-metadata-metadata" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.151024 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-api" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.151035 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" containerName="nova-api-log" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.152192 4764 scope.go:117] "RemoveContainer" 
containerID="ca95ba055a32a0d428208d646911fb0196920bf9994bf72f74317d6dbe7568e6" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.152237 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.157480 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.166470 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.194432 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.196952 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.204118 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.211013 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.259769 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea57de4-69ee-4328-a251-b2cd08c64c7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.259921 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5f8cf6-c3c7-4e35-8690-addc575935e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: 
I1204 02:13:01.260072 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d5f8cf6-c3c7-4e35-8690-addc575935e9-logs\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.260175 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtq2\" (UniqueName: \"kubernetes.io/projected/6ea57de4-69ee-4328-a251-b2cd08c64c7b-kube-api-access-rbtq2\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.260353 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpdcl\" (UniqueName: \"kubernetes.io/projected/9d5f8cf6-c3c7-4e35-8690-addc575935e9-kube-api-access-mpdcl\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.260643 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea57de4-69ee-4328-a251-b2cd08c64c7b-logs\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.260741 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea57de4-69ee-4328-a251-b2cd08c64c7b-config-data\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.260999 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9d5f8cf6-c3c7-4e35-8690-addc575935e9-config-data\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.362912 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea57de4-69ee-4328-a251-b2cd08c64c7b-logs\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.362965 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea57de4-69ee-4328-a251-b2cd08c64c7b-config-data\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.363037 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d5f8cf6-c3c7-4e35-8690-addc575935e9-config-data\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.363085 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea57de4-69ee-4328-a251-b2cd08c64c7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.363115 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5f8cf6-c3c7-4e35-8690-addc575935e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc 
kubenswrapper[4764]: I1204 02:13:01.363138 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d5f8cf6-c3c7-4e35-8690-addc575935e9-logs\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.363168 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtq2\" (UniqueName: \"kubernetes.io/projected/6ea57de4-69ee-4328-a251-b2cd08c64c7b-kube-api-access-rbtq2\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.363255 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpdcl\" (UniqueName: \"kubernetes.io/projected/9d5f8cf6-c3c7-4e35-8690-addc575935e9-kube-api-access-mpdcl\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.364569 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ea57de4-69ee-4328-a251-b2cd08c64c7b-logs\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.364986 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d5f8cf6-c3c7-4e35-8690-addc575935e9-logs\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.369289 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ea57de4-69ee-4328-a251-b2cd08c64c7b-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.369606 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d5f8cf6-c3c7-4e35-8690-addc575935e9-config-data\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.370072 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5f8cf6-c3c7-4e35-8690-addc575935e9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.373388 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ea57de4-69ee-4328-a251-b2cd08c64c7b-config-data\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.383643 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.384566 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtq2\" (UniqueName: \"kubernetes.io/projected/6ea57de4-69ee-4328-a251-b2cd08c64c7b-kube-api-access-rbtq2\") pod \"nova-metadata-0\" (UID: \"6ea57de4-69ee-4328-a251-b2cd08c64c7b\") " pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.386081 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpdcl\" (UniqueName: \"kubernetes.io/projected/9d5f8cf6-c3c7-4e35-8690-addc575935e9-kube-api-access-mpdcl\") pod \"nova-api-0\" (UID: \"9d5f8cf6-c3c7-4e35-8690-addc575935e9\") " pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.464478 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-combined-ca-bundle\") pod \"060031f0-ca89-4cb4-82db-2b095eb44cf0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.464597 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-config-data\") pod \"060031f0-ca89-4cb4-82db-2b095eb44cf0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.464655 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr6kv\" (UniqueName: \"kubernetes.io/projected/060031f0-ca89-4cb4-82db-2b095eb44cf0-kube-api-access-zr6kv\") pod \"060031f0-ca89-4cb4-82db-2b095eb44cf0\" (UID: \"060031f0-ca89-4cb4-82db-2b095eb44cf0\") " Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.469404 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/060031f0-ca89-4cb4-82db-2b095eb44cf0-kube-api-access-zr6kv" (OuterVolumeSpecName: "kube-api-access-zr6kv") pod "060031f0-ca89-4cb4-82db-2b095eb44cf0" (UID: "060031f0-ca89-4cb4-82db-2b095eb44cf0"). InnerVolumeSpecName "kube-api-access-zr6kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.494145 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "060031f0-ca89-4cb4-82db-2b095eb44cf0" (UID: "060031f0-ca89-4cb4-82db-2b095eb44cf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.497115 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-config-data" (OuterVolumeSpecName: "config-data") pod "060031f0-ca89-4cb4-82db-2b095eb44cf0" (UID: "060031f0-ca89-4cb4-82db-2b095eb44cf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.497416 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.536495 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.569632 4764 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.569663 4764 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/060031f0-ca89-4cb4-82db-2b095eb44cf0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.569673 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr6kv\" (UniqueName: \"kubernetes.io/projected/060031f0-ca89-4cb4-82db-2b095eb44cf0-kube-api-access-zr6kv\") on node \"crc\" DevicePath \"\"" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.922265 4764 generic.go:334] "Generic (PLEG): container finished" podID="060031f0-ca89-4cb4-82db-2b095eb44cf0" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" exitCode=0 Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.922405 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"060031f0-ca89-4cb4-82db-2b095eb44cf0","Type":"ContainerDied","Data":"ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7"} Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.922823 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"060031f0-ca89-4cb4-82db-2b095eb44cf0","Type":"ContainerDied","Data":"06baab70f8c4d8028d90a04d424c769470d748dc9bf80acf444cb4a7b635fd8b"} Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.922845 4764 scope.go:117] "RemoveContainer" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.922467 4764 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.950853 4764 scope.go:117] "RemoveContainer" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" Dec 04 02:13:01 crc kubenswrapper[4764]: E1204 02:13:01.951450 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7\": container with ID starting with ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7 not found: ID does not exist" containerID="ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.951503 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7"} err="failed to get container status \"ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7\": rpc error: code = NotFound desc = could not find container \"ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7\": container with ID starting with ee9f6b8cb310d4a441f8a44af6fdf5e56e9a3e436a81dd16abeef213f9615ba7 not found: ID does not exist" Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.961457 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.978818 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 02:13:01 crc kubenswrapper[4764]: I1204 02:13:01.992935 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.006944 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 02:13:02 crc kubenswrapper[4764]: E1204 02:13:02.014795 4764 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060031f0-ca89-4cb4-82db-2b095eb44cf0" containerName="nova-scheduler-scheduler" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.014836 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="060031f0-ca89-4cb4-82db-2b095eb44cf0" containerName="nova-scheduler-scheduler" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.015264 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="060031f0-ca89-4cb4-82db-2b095eb44cf0" containerName="nova-scheduler-scheduler" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.016227 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.016332 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.018733 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.120058 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 02:13:02 crc kubenswrapper[4764]: W1204 02:13:02.124010 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d5f8cf6_c3c7_4e35_8690_addc575935e9.slice/crio-25775b5fdbb50f417d6c10123062794319a433d4cd359e506da598896a0742df WatchSource:0}: Error finding container 25775b5fdbb50f417d6c10123062794319a433d4cd359e506da598896a0742df: Status 404 returned error can't find the container with id 25775b5fdbb50f417d6c10123062794319a433d4cd359e506da598896a0742df Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.126400 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fa9be15b-1720-4d5f-8bab-13aa095347e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.126596 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n584s\" (UniqueName: \"kubernetes.io/projected/fa9be15b-1720-4d5f-8bab-13aa095347e9-kube-api-access-n584s\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.126655 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa9be15b-1720-4d5f-8bab-13aa095347e9-config-data\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.229120 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n584s\" (UniqueName: \"kubernetes.io/projected/fa9be15b-1720-4d5f-8bab-13aa095347e9-kube-api-access-n584s\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.229182 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa9be15b-1720-4d5f-8bab-13aa095347e9-config-data\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.229313 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa9be15b-1720-4d5f-8bab-13aa095347e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.235353 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa9be15b-1720-4d5f-8bab-13aa095347e9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.236429 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa9be15b-1720-4d5f-8bab-13aa095347e9-config-data\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.250364 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n584s\" (UniqueName: \"kubernetes.io/projected/fa9be15b-1720-4d5f-8bab-13aa095347e9-kube-api-access-n584s\") pod \"nova-scheduler-0\" (UID: \"fa9be15b-1720-4d5f-8bab-13aa095347e9\") " pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.335923 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.576490 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060031f0-ca89-4cb4-82db-2b095eb44cf0" path="/var/lib/kubelet/pods/060031f0-ca89-4cb4-82db-2b095eb44cf0/volumes" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.580650 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a62a7a6-2d10-4577-9c35-dc3ecc032237" path="/var/lib/kubelet/pods/9a62a7a6-2d10-4577-9c35-dc3ecc032237/volumes" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.581292 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab096a5c-73c5-409e-a765-00ccd00123d4" path="/var/lib/kubelet/pods/ab096a5c-73c5-409e-a765-00ccd00123d4/volumes" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.820218 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 02:13:02 crc kubenswrapper[4764]: W1204 02:13:02.822914 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa9be15b_1720_4d5f_8bab_13aa095347e9.slice/crio-5115f52207f5e8039548719b5da1a4af3895131aa4a40d47c297893b00cd54ef WatchSource:0}: Error finding container 5115f52207f5e8039548719b5da1a4af3895131aa4a40d47c297893b00cd54ef: Status 404 returned error can't find the container with id 5115f52207f5e8039548719b5da1a4af3895131aa4a40d47c297893b00cd54ef Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.933319 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ea57de4-69ee-4328-a251-b2cd08c64c7b","Type":"ContainerStarted","Data":"7e99ff2cc3dea73764094b815c0418d737ddcf2cdbabb77b5dc019e861287117"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.933587 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"6ea57de4-69ee-4328-a251-b2cd08c64c7b","Type":"ContainerStarted","Data":"28cd6fabfe8cc2bb9ff74c3aef1903c8fd105f94d10af21fca95762eb5022a84"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.933597 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6ea57de4-69ee-4328-a251-b2cd08c64c7b","Type":"ContainerStarted","Data":"b390ee1898ae7bbe91d7dc1fd28f3b66586ecc13e55be5409ca85485826e3039"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.941614 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d5f8cf6-c3c7-4e35-8690-addc575935e9","Type":"ContainerStarted","Data":"58f6ef5209dc2060f327f925b92a599a5b4cc61e088b2d547cb79be42083ab9c"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.941674 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d5f8cf6-c3c7-4e35-8690-addc575935e9","Type":"ContainerStarted","Data":"e3351040fe1f7626a5e20e50b6388be60c75c3dbb4191ab1be7c01ae412ca3fa"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.941693 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9d5f8cf6-c3c7-4e35-8690-addc575935e9","Type":"ContainerStarted","Data":"25775b5fdbb50f417d6c10123062794319a433d4cd359e506da598896a0742df"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.945131 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa9be15b-1720-4d5f-8bab-13aa095347e9","Type":"ContainerStarted","Data":"5115f52207f5e8039548719b5da1a4af3895131aa4a40d47c297893b00cd54ef"} Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.971281 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.971254174 podStartE2EDuration="1.971254174s" podCreationTimestamp="2025-12-04 02:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 02:13:02.959535947 +0000 UTC m=+9118.720860358" watchObservedRunningTime="2025-12-04 02:13:02.971254174 +0000 UTC m=+9118.732578595" Dec 04 02:13:02 crc kubenswrapper[4764]: I1204 02:13:02.990593 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9905526180000002 podStartE2EDuration="1.990552618s" podCreationTimestamp="2025-12-04 02:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 02:13:02.974504684 +0000 UTC m=+9118.735829115" watchObservedRunningTime="2025-12-04 02:13:02.990552618 +0000 UTC m=+9118.751877039" Dec 04 02:13:03 crc kubenswrapper[4764]: I1204 02:13:03.336073 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 02:13:03 crc kubenswrapper[4764]: I1204 02:13:03.960905 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa9be15b-1720-4d5f-8bab-13aa095347e9","Type":"ContainerStarted","Data":"d8a8a6cde792529a0d4c2ad3f1fff7aade98895837efacc9fd6216ec02396370"} Dec 04 02:13:03 crc kubenswrapper[4764]: I1204 02:13:03.988844 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9888232500000003 podStartE2EDuration="2.98882325s" podCreationTimestamp="2025-12-04 02:13:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 02:13:03.9859764 +0000 UTC m=+9119.747300811" watchObservedRunningTime="2025-12-04 02:13:03.98882325 +0000 UTC m=+9119.750147671" Dec 04 02:13:06 crc kubenswrapper[4764]: I1204 02:13:06.498802 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 02:13:06 crc 
kubenswrapper[4764]: I1204 02:13:06.499535 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 02:13:07 crc kubenswrapper[4764]: I1204 02:13:07.336520 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 02:13:08 crc kubenswrapper[4764]: I1204 02:13:08.612737 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 02:13:11 crc kubenswrapper[4764]: I1204 02:13:11.498518 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 02:13:11 crc kubenswrapper[4764]: I1204 02:13:11.499302 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 02:13:11 crc kubenswrapper[4764]: I1204 02:13:11.537531 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 02:13:11 crc kubenswrapper[4764]: I1204 02:13:11.537609 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.336515 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.387180 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.546895 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:13:12 crc kubenswrapper[4764]: E1204 02:13:12.547106 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.582392 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ea57de4-69ee-4328-a251-b2cd08c64c7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.193:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.664964 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6ea57de4-69ee-4328-a251-b2cd08c64c7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.193:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.665021 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9d5f8cf6-c3c7-4e35-8690-addc575935e9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 02:13:12 crc kubenswrapper[4764]: I1204 02:13:12.665157 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9d5f8cf6-c3c7-4e35-8690-addc575935e9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 02:13:13 crc kubenswrapper[4764]: I1204 02:13:13.099762 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.502138 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.502818 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.505963 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.506807 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.542981 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.543385 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.547442 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 02:13:21 crc kubenswrapper[4764]: I1204 02:13:21.550777 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 02:13:22 crc kubenswrapper[4764]: I1204 02:13:22.152381 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 02:13:22 crc kubenswrapper[4764]: I1204 02:13:22.158259 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.338922 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m"] Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.340809 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.342916 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.343250 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.343583 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.343821 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.344076 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.345152 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-bm72t" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.352433 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.362583 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m"] Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.403730 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404042 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404189 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404351 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404501 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404644 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404800 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.404992 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.405121 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 
02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.405236 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.405378 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvwx\" (UniqueName: \"kubernetes.io/projected/69433a1d-a420-4643-9654-ceb18ac6556b-kube-api-access-dzvwx\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.507279 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.507336 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508333 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508367 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508397 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508472 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvwx\" (UniqueName: \"kubernetes.io/projected/69433a1d-a420-4643-9654-ceb18ac6556b-kube-api-access-dzvwx\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508571 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508599 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508640 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508689 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.508738 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.510526 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.511011 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.513856 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.514158 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: 
\"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.514767 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.514850 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.515561 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.516145 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.516169 4764 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.523607 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.527276 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvwx\" (UniqueName: \"kubernetes.io/projected/69433a1d-a420-4643-9654-ceb18ac6556b-kube-api-access-dzvwx\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:23 crc kubenswrapper[4764]: I1204 02:13:23.668210 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:13:24 crc kubenswrapper[4764]: I1204 02:13:24.282927 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m"] Dec 04 02:13:25 crc kubenswrapper[4764]: I1204 02:13:25.232577 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" event={"ID":"69433a1d-a420-4643-9654-ceb18ac6556b","Type":"ContainerStarted","Data":"6f21fee00e7f033966bda1793ec30bf8d537b1f1d558cc07e8952692a03c73ad"} Dec 04 02:13:25 crc kubenswrapper[4764]: I1204 02:13:25.233159 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" event={"ID":"69433a1d-a420-4643-9654-ceb18ac6556b","Type":"ContainerStarted","Data":"12e3e281fa7fda4d92af031362641464c4cdc981ff9c0871c8e5b06fbe9fc91e"} Dec 04 02:13:25 crc kubenswrapper[4764]: I1204 02:13:25.291384 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" podStartSLOduration=1.815498278 podStartE2EDuration="2.291363512s" podCreationTimestamp="2025-12-04 02:13:23 +0000 UTC" firstStartedPulling="2025-12-04 02:13:24.303537647 +0000 UTC m=+9140.064862078" lastFinishedPulling="2025-12-04 02:13:24.779402871 +0000 UTC m=+9140.540727312" observedRunningTime="2025-12-04 02:13:25.276200079 +0000 UTC m=+9141.037524490" watchObservedRunningTime="2025-12-04 02:13:25.291363512 +0000 UTC m=+9141.052687923" Dec 04 02:13:27 crc kubenswrapper[4764]: I1204 02:13:27.546554 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:13:27 crc kubenswrapper[4764]: E1204 02:13:27.547027 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:13:39 crc kubenswrapper[4764]: I1204 02:13:39.546123 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:13:39 crc kubenswrapper[4764]: E1204 02:13:39.547394 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:13:50 crc kubenswrapper[4764]: I1204 02:13:50.546129 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:13:50 crc kubenswrapper[4764]: E1204 02:13:50.547012 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:14:02 crc kubenswrapper[4764]: I1204 02:14:02.546589 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:14:02 crc kubenswrapper[4764]: E1204 02:14:02.547653 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:14:13 crc kubenswrapper[4764]: I1204 02:14:13.546927 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:14:13 crc kubenswrapper[4764]: E1204 02:14:13.548172 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:14:28 crc kubenswrapper[4764]: I1204 02:14:28.546627 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b" Dec 04 02:14:29 crc kubenswrapper[4764]: I1204 02:14:29.136203 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"d2780a14e7d9a554ef96fbe09beeb7b2f53558bf6c717cfe0101f59bd8944f58"} Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.146562 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh"] Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.149011 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.156016 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.156138 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.169985 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh"] Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.229183 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a42f3f-0641-4741-8893-6a0ffcd356e2-secret-volume\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.229292 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a42f3f-0641-4741-8893-6a0ffcd356e2-config-volume\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.229328 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl7kk\" (UniqueName: \"kubernetes.io/projected/06a42f3f-0641-4741-8893-6a0ffcd356e2-kube-api-access-fl7kk\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.330999 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a42f3f-0641-4741-8893-6a0ffcd356e2-secret-volume\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.331077 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a42f3f-0641-4741-8893-6a0ffcd356e2-config-volume\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.331109 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl7kk\" (UniqueName: \"kubernetes.io/projected/06a42f3f-0641-4741-8893-6a0ffcd356e2-kube-api-access-fl7kk\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.331906 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a42f3f-0641-4741-8893-6a0ffcd356e2-config-volume\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.337288 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06a42f3f-0641-4741-8893-6a0ffcd356e2-secret-volume\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.348342 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl7kk\" (UniqueName: \"kubernetes.io/projected/06a42f3f-0641-4741-8893-6a0ffcd356e2-kube-api-access-fl7kk\") pod \"collect-profiles-29413575-r78jh\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.474797 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:00 crc kubenswrapper[4764]: I1204 02:15:00.973756 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh"] Dec 04 02:15:01 crc kubenswrapper[4764]: E1204 02:15:01.468062 4764 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06a42f3f_0641_4741_8893_6a0ffcd356e2.slice/crio-conmon-96bd0781af619d6d8df55bfc42b661194616690b34900a36911a95c77b328879.scope\": RecentStats: unable to find data in memory cache]" Dec 04 02:15:01 crc kubenswrapper[4764]: I1204 02:15:01.535872 4764 generic.go:334] "Generic (PLEG): container finished" podID="06a42f3f-0641-4741-8893-6a0ffcd356e2" containerID="96bd0781af619d6d8df55bfc42b661194616690b34900a36911a95c77b328879" exitCode=0 Dec 04 02:15:01 crc kubenswrapper[4764]: I1204 02:15:01.535937 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" 
event={"ID":"06a42f3f-0641-4741-8893-6a0ffcd356e2","Type":"ContainerDied","Data":"96bd0781af619d6d8df55bfc42b661194616690b34900a36911a95c77b328879"} Dec 04 02:15:01 crc kubenswrapper[4764]: I1204 02:15:01.535972 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" event={"ID":"06a42f3f-0641-4741-8893-6a0ffcd356e2","Type":"ContainerStarted","Data":"f632852fc40122989558c97bb6bc7c6c57a3843cf35eeb95d37f03fb7b01a0c0"} Dec 04 02:15:02 crc kubenswrapper[4764]: I1204 02:15:02.979634 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.092981 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a42f3f-0641-4741-8893-6a0ffcd356e2-config-volume\") pod \"06a42f3f-0641-4741-8893-6a0ffcd356e2\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.093137 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl7kk\" (UniqueName: \"kubernetes.io/projected/06a42f3f-0641-4741-8893-6a0ffcd356e2-kube-api-access-fl7kk\") pod \"06a42f3f-0641-4741-8893-6a0ffcd356e2\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.093226 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a42f3f-0641-4741-8893-6a0ffcd356e2-secret-volume\") pod \"06a42f3f-0641-4741-8893-6a0ffcd356e2\" (UID: \"06a42f3f-0641-4741-8893-6a0ffcd356e2\") " Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.094162 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a42f3f-0641-4741-8893-6a0ffcd356e2-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "06a42f3f-0641-4741-8893-6a0ffcd356e2" (UID: "06a42f3f-0641-4741-8893-6a0ffcd356e2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.098328 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a42f3f-0641-4741-8893-6a0ffcd356e2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06a42f3f-0641-4741-8893-6a0ffcd356e2" (UID: "06a42f3f-0641-4741-8893-6a0ffcd356e2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.099133 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a42f3f-0641-4741-8893-6a0ffcd356e2-kube-api-access-fl7kk" (OuterVolumeSpecName: "kube-api-access-fl7kk") pod "06a42f3f-0641-4741-8893-6a0ffcd356e2" (UID: "06a42f3f-0641-4741-8893-6a0ffcd356e2"). InnerVolumeSpecName "kube-api-access-fl7kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.197072 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a42f3f-0641-4741-8893-6a0ffcd356e2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.197449 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl7kk\" (UniqueName: \"kubernetes.io/projected/06a42f3f-0641-4741-8893-6a0ffcd356e2-kube-api-access-fl7kk\") on node \"crc\" DevicePath \"\"" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.197697 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06a42f3f-0641-4741-8893-6a0ffcd356e2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.574262 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" event={"ID":"06a42f3f-0641-4741-8893-6a0ffcd356e2","Type":"ContainerDied","Data":"f632852fc40122989558c97bb6bc7c6c57a3843cf35eeb95d37f03fb7b01a0c0"} Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.574299 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f632852fc40122989558c97bb6bc7c6c57a3843cf35eeb95d37f03fb7b01a0c0" Dec 04 02:15:03 crc kubenswrapper[4764]: I1204 02:15:03.574331 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413575-r78jh" Dec 04 02:15:04 crc kubenswrapper[4764]: I1204 02:15:04.077954 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv"] Dec 04 02:15:04 crc kubenswrapper[4764]: I1204 02:15:04.093057 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413530-6cjzv"] Dec 04 02:15:04 crc kubenswrapper[4764]: I1204 02:15:04.566193 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8292e040-cca3-4f5c-817d-4c4c9f479e1f" path="/var/lib/kubelet/pods/8292e040-cca3-4f5c-817d-4c4c9f479e1f/volumes" Dec 04 02:15:22 crc kubenswrapper[4764]: I1204 02:15:22.803083 4764 scope.go:117] "RemoveContainer" containerID="e294fb91d22777c68fadea25804d715dfc95714ba1460292ac4ebf82ded0e421" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.607944 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8m8p"] Dec 04 02:15:25 crc kubenswrapper[4764]: E1204 02:15:25.609114 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a42f3f-0641-4741-8893-6a0ffcd356e2" containerName="collect-profiles" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.609128 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a42f3f-0641-4741-8893-6a0ffcd356e2" containerName="collect-profiles" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.609564 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a42f3f-0641-4741-8893-6a0ffcd356e2" containerName="collect-profiles" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.616119 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8m8p" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.636654 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8m8p"] Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.768735 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-catalog-content\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.769048 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-utilities\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.769127 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h79cf\" (UniqueName: \"kubernetes.io/projected/dbb9a3f3-dc61-4262-a628-a5da10c5466d-kube-api-access-h79cf\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.870521 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-catalog-content\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p" Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.870572 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-utilities\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.870608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h79cf\" (UniqueName: \"kubernetes.io/projected/dbb9a3f3-dc61-4262-a628-a5da10c5466d-kube-api-access-h79cf\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.871120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-catalog-content\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.871199 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-utilities\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.894708 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h79cf\" (UniqueName: \"kubernetes.io/projected/dbb9a3f3-dc61-4262-a628-a5da10c5466d-kube-api-access-h79cf\") pod \"redhat-marketplace-l8m8p\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") " pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:25 crc kubenswrapper[4764]: I1204 02:15:25.938069 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:26 crc kubenswrapper[4764]: I1204 02:15:26.453335 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8m8p"]
Dec 04 02:15:26 crc kubenswrapper[4764]: W1204 02:15:26.459553 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb9a3f3_dc61_4262_a628_a5da10c5466d.slice/crio-deff2fd4b1f662b59d3f35c50cb645883914a40a8de9efbf37d140aaa4bba7a9 WatchSource:0}: Error finding container deff2fd4b1f662b59d3f35c50cb645883914a40a8de9efbf37d140aaa4bba7a9: Status 404 returned error can't find the container with id deff2fd4b1f662b59d3f35c50cb645883914a40a8de9efbf37d140aaa4bba7a9
Dec 04 02:15:26 crc kubenswrapper[4764]: I1204 02:15:26.869947 4764 generic.go:334] "Generic (PLEG): container finished" podID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerID="ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529" exitCode=0
Dec 04 02:15:26 crc kubenswrapper[4764]: I1204 02:15:26.870015 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8m8p" event={"ID":"dbb9a3f3-dc61-4262-a628-a5da10c5466d","Type":"ContainerDied","Data":"ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529"}
Dec 04 02:15:26 crc kubenswrapper[4764]: I1204 02:15:26.870258 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8m8p" event={"ID":"dbb9a3f3-dc61-4262-a628-a5da10c5466d","Type":"ContainerStarted","Data":"deff2fd4b1f662b59d3f35c50cb645883914a40a8de9efbf37d140aaa4bba7a9"}
Dec 04 02:15:26 crc kubenswrapper[4764]: I1204 02:15:26.873363 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 02:15:27 crc kubenswrapper[4764]: I1204 02:15:27.883151 4764 generic.go:334] "Generic (PLEG): container finished" podID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerID="3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931" exitCode=0
Dec 04 02:15:27 crc kubenswrapper[4764]: I1204 02:15:27.883250 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8m8p" event={"ID":"dbb9a3f3-dc61-4262-a628-a5da10c5466d","Type":"ContainerDied","Data":"3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931"}
Dec 04 02:15:28 crc kubenswrapper[4764]: I1204 02:15:28.896775 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8m8p" event={"ID":"dbb9a3f3-dc61-4262-a628-a5da10c5466d","Type":"ContainerStarted","Data":"9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6"}
Dec 04 02:15:28 crc kubenswrapper[4764]: I1204 02:15:28.920535 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8m8p" podStartSLOduration=2.481408426 podStartE2EDuration="3.920515961s" podCreationTimestamp="2025-12-04 02:15:25 +0000 UTC" firstStartedPulling="2025-12-04 02:15:26.872759922 +0000 UTC m=+9262.634084343" lastFinishedPulling="2025-12-04 02:15:28.311867457 +0000 UTC m=+9264.073191878" observedRunningTime="2025-12-04 02:15:28.912808402 +0000 UTC m=+9264.674132823" watchObservedRunningTime="2025-12-04 02:15:28.920515961 +0000 UTC m=+9264.681840382"
Dec 04 02:15:35 crc kubenswrapper[4764]: I1204 02:15:35.938496 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:35 crc kubenswrapper[4764]: I1204 02:15:35.940948 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:36 crc kubenswrapper[4764]: I1204 02:15:36.606218 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:37 crc kubenswrapper[4764]: I1204 02:15:37.065447 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:37 crc kubenswrapper[4764]: I1204 02:15:37.172701 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8m8p"]
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.028022 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8m8p" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="registry-server" containerID="cri-o://9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6" gracePeriod=2
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.527853 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.587286 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h79cf\" (UniqueName: \"kubernetes.io/projected/dbb9a3f3-dc61-4262-a628-a5da10c5466d-kube-api-access-h79cf\") pod \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") "
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.587515 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-utilities\") pod \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") "
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.587622 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-catalog-content\") pod \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\" (UID: \"dbb9a3f3-dc61-4262-a628-a5da10c5466d\") "
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.588630 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-utilities" (OuterVolumeSpecName: "utilities") pod "dbb9a3f3-dc61-4262-a628-a5da10c5466d" (UID: "dbb9a3f3-dc61-4262-a628-a5da10c5466d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.592771 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb9a3f3-dc61-4262-a628-a5da10c5466d-kube-api-access-h79cf" (OuterVolumeSpecName: "kube-api-access-h79cf") pod "dbb9a3f3-dc61-4262-a628-a5da10c5466d" (UID: "dbb9a3f3-dc61-4262-a628-a5da10c5466d"). InnerVolumeSpecName "kube-api-access-h79cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.609634 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbb9a3f3-dc61-4262-a628-a5da10c5466d" (UID: "dbb9a3f3-dc61-4262-a628-a5da10c5466d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.690520 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h79cf\" (UniqueName: \"kubernetes.io/projected/dbb9a3f3-dc61-4262-a628-a5da10c5466d-kube-api-access-h79cf\") on node \"crc\" DevicePath \"\""
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.691321 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 02:15:39 crc kubenswrapper[4764]: I1204 02:15:39.691443 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbb9a3f3-dc61-4262-a628-a5da10c5466d-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.040154 4764 generic.go:334] "Generic (PLEG): container finished" podID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerID="9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6" exitCode=0
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.040219 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8m8p" event={"ID":"dbb9a3f3-dc61-4262-a628-a5da10c5466d","Type":"ContainerDied","Data":"9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6"}
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.040280 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8m8p"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.040297 4764 scope.go:117] "RemoveContainer" containerID="9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.040286 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8m8p" event={"ID":"dbb9a3f3-dc61-4262-a628-a5da10c5466d","Type":"ContainerDied","Data":"deff2fd4b1f662b59d3f35c50cb645883914a40a8de9efbf37d140aaa4bba7a9"}
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.066590 4764 scope.go:117] "RemoveContainer" containerID="3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.093406 4764 scope.go:117] "RemoveContainer" containerID="ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.100273 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8m8p"]
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.112488 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8m8p"]
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.171213 4764 scope.go:117] "RemoveContainer" containerID="9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6"
Dec 04 02:15:40 crc kubenswrapper[4764]: E1204 02:15:40.171852 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6\": container with ID starting with 9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6 not found: ID does not exist" containerID="9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.171896 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6"} err="failed to get container status \"9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6\": rpc error: code = NotFound desc = could not find container \"9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6\": container with ID starting with 9b4110c86133112306f79ab4f7973923449ac902c075375d47e0f2e3437486e6 not found: ID does not exist"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.171926 4764 scope.go:117] "RemoveContainer" containerID="3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931"
Dec 04 02:15:40 crc kubenswrapper[4764]: E1204 02:15:40.172392 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931\": container with ID starting with 3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931 not found: ID does not exist" containerID="3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.172424 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931"} err="failed to get container status \"3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931\": rpc error: code = NotFound desc = could not find container \"3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931\": container with ID starting with 3ee905d02c1ccd512f08d8bc22f9191d62efdc946a863ebf95ebd704f8537931 not found: ID does not exist"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.172492 4764 scope.go:117] "RemoveContainer" containerID="ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529"
Dec 04 02:15:40 crc kubenswrapper[4764]: E1204 02:15:40.172894 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529\": container with ID starting with ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529 not found: ID does not exist" containerID="ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.172946 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529"} err="failed to get container status \"ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529\": rpc error: code = NotFound desc = could not find container \"ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529\": container with ID starting with ce5a6a0e03e9eb9fbeec4bfaec765f558287e590a7929eae9134202e3eec1529 not found: ID does not exist"
Dec 04 02:15:40 crc kubenswrapper[4764]: I1204 02:15:40.565770 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" path="/var/lib/kubelet/pods/dbb9a3f3-dc61-4262-a628-a5da10c5466d/volumes"
Dec 04 02:16:50 crc kubenswrapper[4764]: I1204 02:16:50.868987 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 02:16:50 crc kubenswrapper[4764]: I1204 02:16:50.869532 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 02:17:20 crc kubenswrapper[4764]: I1204 02:17:20.869508 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 02:17:20 crc kubenswrapper[4764]: I1204 02:17:20.870275 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 02:17:50 crc kubenswrapper[4764]: I1204 02:17:50.868575 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 02:17:50 crc kubenswrapper[4764]: I1204 02:17:50.869759 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 02:17:50 crc kubenswrapper[4764]: I1204 02:17:50.869823 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl"
Dec 04 02:17:50 crc kubenswrapper[4764]: I1204 02:17:50.871970 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2780a14e7d9a554ef96fbe09beeb7b2f53558bf6c717cfe0101f59bd8944f58"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 02:17:50 crc kubenswrapper[4764]: I1204 02:17:50.872108 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://d2780a14e7d9a554ef96fbe09beeb7b2f53558bf6c717cfe0101f59bd8944f58" gracePeriod=600
Dec 04 02:17:51 crc kubenswrapper[4764]: I1204 02:17:51.890592 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="d2780a14e7d9a554ef96fbe09beeb7b2f53558bf6c717cfe0101f59bd8944f58" exitCode=0
Dec 04 02:17:51 crc kubenswrapper[4764]: I1204 02:17:51.890665 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"d2780a14e7d9a554ef96fbe09beeb7b2f53558bf6c717cfe0101f59bd8944f58"}
Dec 04 02:17:51 crc kubenswrapper[4764]: I1204 02:17:51.891087 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000"}
Dec 04 02:17:51 crc kubenswrapper[4764]: I1204 02:17:51.891120 4764 scope.go:117] "RemoveContainer" containerID="ac9aa61e1cf02196d4204a5d84f1c4f1fad312c9c45bcf33b9696f6ab72de78b"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.103097 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7dj5n"]
Dec 04 02:18:57 crc kubenswrapper[4764]: E1204 02:18:57.104530 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="extract-content"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.104551 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="extract-content"
Dec 04 02:18:57 crc kubenswrapper[4764]: E1204 02:18:57.104604 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="registry-server"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.104617 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="registry-server"
Dec 04 02:18:57 crc kubenswrapper[4764]: E1204 02:18:57.104645 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="extract-utilities"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.104658 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="extract-utilities"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.105039 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb9a3f3-dc61-4262-a628-a5da10c5466d" containerName="registry-server"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.108500 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.143786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dj5n"]
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.214960 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-utilities\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.215027 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-catalog-content\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.215231 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqsbn\" (UniqueName: \"kubernetes.io/projected/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-kube-api-access-vqsbn\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.317608 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-catalog-content\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.317748 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqsbn\" (UniqueName: \"kubernetes.io/projected/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-kube-api-access-vqsbn\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.317863 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-utilities\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.318360 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-utilities\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.318387 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-catalog-content\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.356689 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqsbn\" (UniqueName: \"kubernetes.io/projected/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-kube-api-access-vqsbn\") pod \"certified-operators-7dj5n\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:57 crc kubenswrapper[4764]: I1204 02:18:57.437783 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dj5n"
Dec 04 02:18:58 crc kubenswrapper[4764]: I1204 02:18:58.019235 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dj5n"]
Dec 04 02:18:58 crc kubenswrapper[4764]: W1204 02:18:58.019227 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e58edaf_f2cd_4a62_866c_2be2d26eab2f.slice/crio-9bf082d861924451cbd0ee8d3dc00595b155e938763e7af4b7d0be7f90f38066 WatchSource:0}: Error finding container 9bf082d861924451cbd0ee8d3dc00595b155e938763e7af4b7d0be7f90f38066: Status 404 returned error can't find the container with id 9bf082d861924451cbd0ee8d3dc00595b155e938763e7af4b7d0be7f90f38066
Dec 04 02:18:58 crc kubenswrapper[4764]: I1204 02:18:58.811655 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerID="bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82" exitCode=0
Dec 04 02:18:58 crc kubenswrapper[4764]: I1204 02:18:58.811817 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dj5n" event={"ID":"0e58edaf-f2cd-4a62-866c-2be2d26eab2f","Type":"ContainerDied","Data":"bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82"}
Dec 04 02:18:58 crc kubenswrapper[4764]: I1204 02:18:58.812574 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dj5n" event={"ID":"0e58edaf-f2cd-4a62-866c-2be2d26eab2f","Type":"ContainerStarted","Data":"9bf082d861924451cbd0ee8d3dc00595b155e938763e7af4b7d0be7f90f38066"}
Dec 04 02:18:59 crc kubenswrapper[4764]: I1204 02:18:59.831296 4764 generic.go:334] "Generic (PLEG): container finished" podID="69433a1d-a420-4643-9654-ceb18ac6556b" containerID="6f21fee00e7f033966bda1793ec30bf8d537b1f1d558cc07e8952692a03c73ad" exitCode=0
Dec 04 02:18:59 crc kubenswrapper[4764]: I1204 02:18:59.831425 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" event={"ID":"69433a1d-a420-4643-9654-ceb18ac6556b","Type":"ContainerDied","Data":"6f21fee00e7f033966bda1793ec30bf8d537b1f1d558cc07e8952692a03c73ad"}
Dec 04 02:19:00 crc kubenswrapper[4764]: I1204 02:19:00.848857 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerID="a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f" exitCode=0
Dec 04 02:19:00 crc kubenswrapper[4764]: I1204 02:19:00.848991 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dj5n" event={"ID":"0e58edaf-f2cd-4a62-866c-2be2d26eab2f","Type":"ContainerDied","Data":"a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f"}
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.494456 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m"
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628489 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-1\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628602 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ceph\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628651 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-1\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628685 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ssh-key\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628709 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzvwx\" (UniqueName: \"kubernetes.io/projected/69433a1d-a420-4643-9654-ceb18ac6556b-kube-api-access-dzvwx\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628755 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-combined-ca-bundle\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628825 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-0\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628882 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-0\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628921 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-inventory\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628952 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-1\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.628986 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-0\") pod \"69433a1d-a420-4643-9654-ceb18ac6556b\" (UID: \"69433a1d-a420-4643-9654-ceb18ac6556b\") "
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.635016 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69433a1d-a420-4643-9654-ceb18ac6556b-kube-api-access-dzvwx" (OuterVolumeSpecName: "kube-api-access-dzvwx") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "kube-api-access-dzvwx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.642613 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ceph" (OuterVolumeSpecName: "ceph") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.660798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.661845 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-inventory" (OuterVolumeSpecName: "inventory") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.662359 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.668881 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.669031 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.675529 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.678071 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.688972 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.690987 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "69433a1d-a420-4643-9654-ceb18ac6556b" (UID: "69433a1d-a420-4643-9654-ceb18ac6556b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731770 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731799 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731809 4764 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-inventory\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731818 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731827 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731836 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731844 4764 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ceph\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731852 4764 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731859 4764 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731867 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzvwx\" (UniqueName: \"kubernetes.io/projected/69433a1d-a420-4643-9654-ceb18ac6556b-kube-api-access-dzvwx\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.731875 4764 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69433a1d-a420-4643-9654-ceb18ac6556b-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.865213 4764 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.865211 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m" event={"ID":"69433a1d-a420-4643-9654-ceb18ac6556b","Type":"ContainerDied","Data":"12e3e281fa7fda4d92af031362641464c4cdc981ff9c0871c8e5b06fbe9fc91e"} Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.865413 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e3e281fa7fda4d92af031362641464c4cdc981ff9c0871c8e5b06fbe9fc91e" Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.870120 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dj5n" event={"ID":"0e58edaf-f2cd-4a62-866c-2be2d26eab2f","Type":"ContainerStarted","Data":"da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632"} Dec 04 02:19:01 crc kubenswrapper[4764]: I1204 02:19:01.891662 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7dj5n" podStartSLOduration=2.376466704 podStartE2EDuration="4.891643651s" podCreationTimestamp="2025-12-04 02:18:57 +0000 UTC" firstStartedPulling="2025-12-04 02:18:58.82118767 +0000 UTC m=+9474.582512121" lastFinishedPulling="2025-12-04 02:19:01.336364657 +0000 UTC m=+9477.097689068" observedRunningTime="2025-12-04 02:19:01.888390472 +0000 UTC m=+9477.649714883" watchObservedRunningTime="2025-12-04 02:19:01.891643651 +0000 UTC m=+9477.652968062" Dec 04 02:19:07 crc kubenswrapper[4764]: I1204 02:19:07.438194 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7dj5n" Dec 04 02:19:07 crc kubenswrapper[4764]: I1204 02:19:07.438690 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-7dj5n" Dec 04 02:19:07 crc kubenswrapper[4764]: I1204 02:19:07.493367 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7dj5n" Dec 04 02:19:07 crc kubenswrapper[4764]: I1204 02:19:07.996181 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7dj5n" Dec 04 02:19:08 crc kubenswrapper[4764]: I1204 02:19:08.044842 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dj5n"] Dec 04 02:19:09 crc kubenswrapper[4764]: I1204 02:19:09.961419 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7dj5n" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="registry-server" containerID="cri-o://da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632" gracePeriod=2 Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.510476 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7dj5n" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.533176 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqsbn\" (UniqueName: \"kubernetes.io/projected/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-kube-api-access-vqsbn\") pod \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.533279 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-utilities\") pod \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.533382 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-catalog-content\") pod \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\" (UID: \"0e58edaf-f2cd-4a62-866c-2be2d26eab2f\") " Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.534593 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-utilities" (OuterVolumeSpecName: "utilities") pod "0e58edaf-f2cd-4a62-866c-2be2d26eab2f" (UID: "0e58edaf-f2cd-4a62-866c-2be2d26eab2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.540970 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-kube-api-access-vqsbn" (OuterVolumeSpecName: "kube-api-access-vqsbn") pod "0e58edaf-f2cd-4a62-866c-2be2d26eab2f" (UID: "0e58edaf-f2cd-4a62-866c-2be2d26eab2f"). InnerVolumeSpecName "kube-api-access-vqsbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.632408 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e58edaf-f2cd-4a62-866c-2be2d26eab2f" (UID: "0e58edaf-f2cd-4a62-866c-2be2d26eab2f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.635367 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.635384 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqsbn\" (UniqueName: \"kubernetes.io/projected/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-kube-api-access-vqsbn\") on node \"crc\" DevicePath \"\"" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.635395 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e58edaf-f2cd-4a62-866c-2be2d26eab2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.981157 4764 generic.go:334] "Generic (PLEG): container finished" podID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerID="da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632" exitCode=0 Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.981214 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dj5n" event={"ID":"0e58edaf-f2cd-4a62-866c-2be2d26eab2f","Type":"ContainerDied","Data":"da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632"} Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.981241 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-7dj5n" event={"ID":"0e58edaf-f2cd-4a62-866c-2be2d26eab2f","Type":"ContainerDied","Data":"9bf082d861924451cbd0ee8d3dc00595b155e938763e7af4b7d0be7f90f38066"} Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.981257 4764 scope.go:117] "RemoveContainer" containerID="da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632" Dec 04 02:19:10 crc kubenswrapper[4764]: I1204 02:19:10.981435 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dj5n" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.019274 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dj5n"] Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.029543 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7dj5n"] Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.058026 4764 scope.go:117] "RemoveContainer" containerID="a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.078963 4764 scope.go:117] "RemoveContainer" containerID="bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.127230 4764 scope.go:117] "RemoveContainer" containerID="da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632" Dec 04 02:19:11 crc kubenswrapper[4764]: E1204 02:19:11.127599 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632\": container with ID starting with da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632 not found: ID does not exist" containerID="da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 
02:19:11.127647 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632"} err="failed to get container status \"da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632\": rpc error: code = NotFound desc = could not find container \"da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632\": container with ID starting with da6c56856587cc77e5e1d8cabfa82208905b3a82b9e529f0436c7a4cef626632 not found: ID does not exist" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.127678 4764 scope.go:117] "RemoveContainer" containerID="a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f" Dec 04 02:19:11 crc kubenswrapper[4764]: E1204 02:19:11.128141 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f\": container with ID starting with a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f not found: ID does not exist" containerID="a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.128174 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f"} err="failed to get container status \"a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f\": rpc error: code = NotFound desc = could not find container \"a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f\": container with ID starting with a76146b45abf6297c7ee70b14dda446f7a1796264b286f632a579c67be59127f not found: ID does not exist" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.128195 4764 scope.go:117] "RemoveContainer" containerID="bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82" Dec 04 02:19:11 crc 
kubenswrapper[4764]: E1204 02:19:11.128438 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82\": container with ID starting with bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82 not found: ID does not exist" containerID="bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82" Dec 04 02:19:11 crc kubenswrapper[4764]: I1204 02:19:11.128469 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82"} err="failed to get container status \"bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82\": rpc error: code = NotFound desc = could not find container \"bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82\": container with ID starting with bf8c79c352a60272bb39f78fd2edb589c11c35f451e67613ba61d77543c7fd82 not found: ID does not exist" Dec 04 02:19:12 crc kubenswrapper[4764]: I1204 02:19:12.568659 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" path="/var/lib/kubelet/pods/0e58edaf-f2cd-4a62-866c-2be2d26eab2f/volumes" Dec 04 02:19:58 crc kubenswrapper[4764]: E1204 02:19:58.587351 4764 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:39572->38.102.83.13:39483: write tcp 38.102.83.13:39572->38.102.83.13:39483: write: broken pipe Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.749416 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fp87f"] Dec 04 02:20:09 crc kubenswrapper[4764]: E1204 02:20:09.750244 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69433a1d-a420-4643-9654-ceb18ac6556b" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 04 02:20:09 crc kubenswrapper[4764]: 
I1204 02:20:09.750258 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="69433a1d-a420-4643-9654-ceb18ac6556b" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 04 02:20:09 crc kubenswrapper[4764]: E1204 02:20:09.750276 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="extract-content" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.750283 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="extract-content" Dec 04 02:20:09 crc kubenswrapper[4764]: E1204 02:20:09.750299 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="extract-utilities" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.750305 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="extract-utilities" Dec 04 02:20:09 crc kubenswrapper[4764]: E1204 02:20:09.750319 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="registry-server" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.750325 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="registry-server" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.750525 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="69433a1d-a420-4643-9654-ceb18ac6556b" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.750545 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e58edaf-f2cd-4a62-866c-2be2d26eab2f" containerName="registry-server" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.751996 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.771437 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp87f"] Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.819199 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-utilities\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.819333 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-catalog-content\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.819374 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56zzt\" (UniqueName: \"kubernetes.io/projected/a377522d-311a-44d4-8c36-74fef2eb8804-kube-api-access-56zzt\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.922500 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-utilities\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.922612 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-catalog-content\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.922644 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56zzt\" (UniqueName: \"kubernetes.io/projected/a377522d-311a-44d4-8c36-74fef2eb8804-kube-api-access-56zzt\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.923118 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-utilities\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.923141 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-catalog-content\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:09 crc kubenswrapper[4764]: I1204 02:20:09.964052 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56zzt\" (UniqueName: \"kubernetes.io/projected/a377522d-311a-44d4-8c36-74fef2eb8804-kube-api-access-56zzt\") pod \"redhat-operators-fp87f\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:10 crc kubenswrapper[4764]: I1204 02:20:10.126290 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:10 crc kubenswrapper[4764]: I1204 02:20:10.660485 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp87f"] Dec 04 02:20:10 crc kubenswrapper[4764]: I1204 02:20:10.775778 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerStarted","Data":"641361e784bc7cbcc0206083e7c6fb3c3a3e5f20b35f63d275a9a1478b85a307"} Dec 04 02:20:11 crc kubenswrapper[4764]: I1204 02:20:11.791358 4764 generic.go:334] "Generic (PLEG): container finished" podID="a377522d-311a-44d4-8c36-74fef2eb8804" containerID="df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452" exitCode=0 Dec 04 02:20:11 crc kubenswrapper[4764]: I1204 02:20:11.791494 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerDied","Data":"df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452"} Dec 04 02:20:12 crc kubenswrapper[4764]: I1204 02:20:12.807525 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerStarted","Data":"71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99"} Dec 04 02:20:15 crc kubenswrapper[4764]: I1204 02:20:15.851979 4764 generic.go:334] "Generic (PLEG): container finished" podID="a377522d-311a-44d4-8c36-74fef2eb8804" containerID="71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99" exitCode=0 Dec 04 02:20:15 crc kubenswrapper[4764]: I1204 02:20:15.852426 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" 
event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerDied","Data":"71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99"} Dec 04 02:20:16 crc kubenswrapper[4764]: I1204 02:20:16.865024 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerStarted","Data":"371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b"} Dec 04 02:20:16 crc kubenswrapper[4764]: I1204 02:20:16.896183 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fp87f" podStartSLOduration=3.418251668 podStartE2EDuration="7.896158096s" podCreationTimestamp="2025-12-04 02:20:09 +0000 UTC" firstStartedPulling="2025-12-04 02:20:11.794950683 +0000 UTC m=+9547.556275094" lastFinishedPulling="2025-12-04 02:20:16.272857091 +0000 UTC m=+9552.034181522" observedRunningTime="2025-12-04 02:20:16.884984441 +0000 UTC m=+9552.646308852" watchObservedRunningTime="2025-12-04 02:20:16.896158096 +0000 UTC m=+9552.657482547" Dec 04 02:20:20 crc kubenswrapper[4764]: I1204 02:20:20.126882 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:20 crc kubenswrapper[4764]: I1204 02:20:20.127306 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:20 crc kubenswrapper[4764]: I1204 02:20:20.875896 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:20:20 crc kubenswrapper[4764]: I1204 02:20:20.876246 4764 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:20:21 crc kubenswrapper[4764]: I1204 02:20:21.171926 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fp87f" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="registry-server" probeResult="failure" output=< Dec 04 02:20:21 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 02:20:21 crc kubenswrapper[4764]: > Dec 04 02:20:30 crc kubenswrapper[4764]: I1204 02:20:30.272179 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:30 crc kubenswrapper[4764]: I1204 02:20:30.475706 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:30 crc kubenswrapper[4764]: I1204 02:20:30.558794 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp87f"] Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.044690 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fp87f" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="registry-server" containerID="cri-o://371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b" gracePeriod=2 Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.621451 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.780355 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-utilities\") pod \"a377522d-311a-44d4-8c36-74fef2eb8804\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.781020 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-catalog-content\") pod \"a377522d-311a-44d4-8c36-74fef2eb8804\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.781459 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56zzt\" (UniqueName: \"kubernetes.io/projected/a377522d-311a-44d4-8c36-74fef2eb8804-kube-api-access-56zzt\") pod \"a377522d-311a-44d4-8c36-74fef2eb8804\" (UID: \"a377522d-311a-44d4-8c36-74fef2eb8804\") " Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.781616 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-utilities" (OuterVolumeSpecName: "utilities") pod "a377522d-311a-44d4-8c36-74fef2eb8804" (UID: "a377522d-311a-44d4-8c36-74fef2eb8804"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.783063 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.786226 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a377522d-311a-44d4-8c36-74fef2eb8804-kube-api-access-56zzt" (OuterVolumeSpecName: "kube-api-access-56zzt") pod "a377522d-311a-44d4-8c36-74fef2eb8804" (UID: "a377522d-311a-44d4-8c36-74fef2eb8804"). InnerVolumeSpecName "kube-api-access-56zzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.885999 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56zzt\" (UniqueName: \"kubernetes.io/projected/a377522d-311a-44d4-8c36-74fef2eb8804-kube-api-access-56zzt\") on node \"crc\" DevicePath \"\"" Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.888608 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a377522d-311a-44d4-8c36-74fef2eb8804" (UID: "a377522d-311a-44d4-8c36-74fef2eb8804"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:20:32 crc kubenswrapper[4764]: I1204 02:20:32.990049 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a377522d-311a-44d4-8c36-74fef2eb8804-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.069220 4764 generic.go:334] "Generic (PLEG): container finished" podID="a377522d-311a-44d4-8c36-74fef2eb8804" containerID="371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b" exitCode=0 Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.069273 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerDied","Data":"371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b"} Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.069306 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp87f" event={"ID":"a377522d-311a-44d4-8c36-74fef2eb8804","Type":"ContainerDied","Data":"641361e784bc7cbcc0206083e7c6fb3c3a3e5f20b35f63d275a9a1478b85a307"} Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.069322 4764 scope.go:117] "RemoveContainer" containerID="371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.069468 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp87f" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.091573 4764 scope.go:117] "RemoveContainer" containerID="71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.128420 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp87f"] Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.144349 4764 scope.go:117] "RemoveContainer" containerID="df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.150340 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fp87f"] Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.212706 4764 scope.go:117] "RemoveContainer" containerID="371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b" Dec 04 02:20:33 crc kubenswrapper[4764]: E1204 02:20:33.213393 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b\": container with ID starting with 371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b not found: ID does not exist" containerID="371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.213512 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b"} err="failed to get container status \"371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b\": rpc error: code = NotFound desc = could not find container \"371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b\": container with ID starting with 371a615e2545147ca9d7184c1cfa569d1ef30a7a7376e0f3c06cbd09fd8a8e7b not found: ID does 
not exist" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.213626 4764 scope.go:117] "RemoveContainer" containerID="71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99" Dec 04 02:20:33 crc kubenswrapper[4764]: E1204 02:20:33.214148 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99\": container with ID starting with 71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99 not found: ID does not exist" containerID="71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.214272 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99"} err="failed to get container status \"71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99\": rpc error: code = NotFound desc = could not find container \"71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99\": container with ID starting with 71f145f1fe044981f43376fa716b6eb829583c1c866ecd09e9417f5e7b9cab99 not found: ID does not exist" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.214377 4764 scope.go:117] "RemoveContainer" containerID="df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452" Dec 04 02:20:33 crc kubenswrapper[4764]: E1204 02:20:33.214850 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452\": container with ID starting with df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452 not found: ID does not exist" containerID="df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452" Dec 04 02:20:33 crc kubenswrapper[4764]: I1204 02:20:33.214956 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452"} err="failed to get container status \"df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452\": rpc error: code = NotFound desc = could not find container \"df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452\": container with ID starting with df57dc434e34dd9811d8a4268532655d0624d1571baa82b2875b55bc80c50452 not found: ID does not exist" Dec 04 02:20:34 crc kubenswrapper[4764]: I1204 02:20:34.564964 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" path="/var/lib/kubelet/pods/a377522d-311a-44d4-8c36-74fef2eb8804/volumes" Dec 04 02:20:50 crc kubenswrapper[4764]: I1204 02:20:50.869352 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:20:50 crc kubenswrapper[4764]: I1204 02:20:50.870060 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:21:09 crc kubenswrapper[4764]: I1204 02:21:09.752368 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 04 02:21:09 crc kubenswrapper[4764]: I1204 02:21:09.753855 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="e67613d0-4acc-4685-be57-a7a4f9d44a08" containerName="adoption" containerID="cri-o://0df504eadc1d838dee7c9c7ea4f5bed08b7ad14bb3b1e3052b82054f986a0a8b" gracePeriod=30 Dec 04 
02:21:20 crc kubenswrapper[4764]: I1204 02:21:20.868241 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:21:20 crc kubenswrapper[4764]: I1204 02:21:20.868746 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:21:20 crc kubenswrapper[4764]: I1204 02:21:20.868789 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 02:21:20 crc kubenswrapper[4764]: I1204 02:21:20.869411 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 02:21:20 crc kubenswrapper[4764]: I1204 02:21:20.869457 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" gracePeriod=600 Dec 04 02:21:21 crc kubenswrapper[4764]: E1204 02:21:21.508101 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:21:21 crc kubenswrapper[4764]: I1204 02:21:21.694474 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" exitCode=0 Dec 04 02:21:21 crc kubenswrapper[4764]: I1204 02:21:21.694512 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000"} Dec 04 02:21:21 crc kubenswrapper[4764]: I1204 02:21:21.694818 4764 scope.go:117] "RemoveContainer" containerID="d2780a14e7d9a554ef96fbe09beeb7b2f53558bf6c717cfe0101f59bd8944f58" Dec 04 02:21:21 crc kubenswrapper[4764]: I1204 02:21:21.695703 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:21:21 crc kubenswrapper[4764]: E1204 02:21:21.696304 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:21:33 crc kubenswrapper[4764]: I1204 02:21:33.546758 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:21:33 crc kubenswrapper[4764]: E1204 02:21:33.549309 4764 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:21:39 crc kubenswrapper[4764]: I1204 02:21:39.947714 4764 generic.go:334] "Generic (PLEG): container finished" podID="e67613d0-4acc-4685-be57-a7a4f9d44a08" containerID="0df504eadc1d838dee7c9c7ea4f5bed08b7ad14bb3b1e3052b82054f986a0a8b" exitCode=137 Dec 04 02:21:39 crc kubenswrapper[4764]: I1204 02:21:39.947840 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e67613d0-4acc-4685-be57-a7a4f9d44a08","Type":"ContainerDied","Data":"0df504eadc1d838dee7c9c7ea4f5bed08b7ad14bb3b1e3052b82054f986a0a8b"} Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.375082 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.537791 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7fdv\" (UniqueName: \"kubernetes.io/projected/e67613d0-4acc-4685-be57-a7a4f9d44a08-kube-api-access-w7fdv\") pod \"e67613d0-4acc-4685-be57-a7a4f9d44a08\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.538607 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") pod \"e67613d0-4acc-4685-be57-a7a4f9d44a08\" (UID: \"e67613d0-4acc-4685-be57-a7a4f9d44a08\") " Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.546006 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67613d0-4acc-4685-be57-a7a4f9d44a08-kube-api-access-w7fdv" (OuterVolumeSpecName: "kube-api-access-w7fdv") pod "e67613d0-4acc-4685-be57-a7a4f9d44a08" (UID: "e67613d0-4acc-4685-be57-a7a4f9d44a08"). InnerVolumeSpecName "kube-api-access-w7fdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.575805 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d" (OuterVolumeSpecName: "mariadb-data") pod "e67613d0-4acc-4685-be57-a7a4f9d44a08" (UID: "e67613d0-4acc-4685-be57-a7a4f9d44a08"). InnerVolumeSpecName "pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.642943 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") on node \"crc\" " Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.642983 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7fdv\" (UniqueName: \"kubernetes.io/projected/e67613d0-4acc-4685-be57-a7a4f9d44a08-kube-api-access-w7fdv\") on node \"crc\" DevicePath \"\"" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.689334 4764 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.689465 4764 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d") on node "crc" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.745275 4764 reconciler_common.go:293] "Volume detached for volume \"pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-72f86e71-6d7c-4c8b-9f54-bfda2152162d\") on node \"crc\" DevicePath \"\"" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.963098 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e67613d0-4acc-4685-be57-a7a4f9d44a08","Type":"ContainerDied","Data":"d6ed7f4aa744de1c973ac117a7f929840cf5b00260cb1554f8fb31e4be8ef8bd"} Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.963212 4764 scope.go:117] "RemoveContainer" containerID="0df504eadc1d838dee7c9c7ea4f5bed08b7ad14bb3b1e3052b82054f986a0a8b" Dec 04 02:21:40 crc kubenswrapper[4764]: I1204 02:21:40.963137 
4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 04 02:21:41 crc kubenswrapper[4764]: I1204 02:21:41.004001 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 04 02:21:41 crc kubenswrapper[4764]: I1204 02:21:41.013520 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 04 02:21:41 crc kubenswrapper[4764]: I1204 02:21:41.705015 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 04 02:21:41 crc kubenswrapper[4764]: I1204 02:21:41.705571 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="a70e1201-59bc-45ce-b002-848176bc24cf" containerName="adoption" containerID="cri-o://a094c7a69f206ccbfa8fef0e881fa8c00abdc49bb06aebbdef92bbdb3d484e3d" gracePeriod=30 Dec 04 02:21:42 crc kubenswrapper[4764]: I1204 02:21:42.563103 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67613d0-4acc-4685-be57-a7a4f9d44a08" path="/var/lib/kubelet/pods/e67613d0-4acc-4685-be57-a7a4f9d44a08/volumes" Dec 04 02:21:46 crc kubenswrapper[4764]: I1204 02:21:46.546900 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:21:46 crc kubenswrapper[4764]: E1204 02:21:46.547825 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:22:01 crc kubenswrapper[4764]: I1204 02:22:01.546147 4764 scope.go:117] "RemoveContainer" 
containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:22:01 crc kubenswrapper[4764]: E1204 02:22:01.546796 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.388644 4764 generic.go:334] "Generic (PLEG): container finished" podID="a70e1201-59bc-45ce-b002-848176bc24cf" containerID="a094c7a69f206ccbfa8fef0e881fa8c00abdc49bb06aebbdef92bbdb3d484e3d" exitCode=137 Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.388746 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a70e1201-59bc-45ce-b002-848176bc24cf","Type":"ContainerDied","Data":"a094c7a69f206ccbfa8fef0e881fa8c00abdc49bb06aebbdef92bbdb3d484e3d"} Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.389278 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a70e1201-59bc-45ce-b002-848176bc24cf","Type":"ContainerDied","Data":"447ae8b144c85ab806fc1aabf8d2c43b9901653e7c20b93ed7fe6798ba32be1a"} Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.389296 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="447ae8b144c85ab806fc1aabf8d2c43b9901653e7c20b93ed7fe6798ba32be1a" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.425537 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.517596 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a70e1201-59bc-45ce-b002-848176bc24cf-ovn-data-cert\") pod \"a70e1201-59bc-45ce-b002-848176bc24cf\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.518096 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") pod \"a70e1201-59bc-45ce-b002-848176bc24cf\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.518348 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txvlf\" (UniqueName: \"kubernetes.io/projected/a70e1201-59bc-45ce-b002-848176bc24cf-kube-api-access-txvlf\") pod \"a70e1201-59bc-45ce-b002-848176bc24cf\" (UID: \"a70e1201-59bc-45ce-b002-848176bc24cf\") " Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.524613 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a70e1201-59bc-45ce-b002-848176bc24cf-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "a70e1201-59bc-45ce-b002-848176bc24cf" (UID: "a70e1201-59bc-45ce-b002-848176bc24cf"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.529077 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70e1201-59bc-45ce-b002-848176bc24cf-kube-api-access-txvlf" (OuterVolumeSpecName: "kube-api-access-txvlf") pod "a70e1201-59bc-45ce-b002-848176bc24cf" (UID: "a70e1201-59bc-45ce-b002-848176bc24cf"). InnerVolumeSpecName "kube-api-access-txvlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.537583 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa" (OuterVolumeSpecName: "ovn-data") pod "a70e1201-59bc-45ce-b002-848176bc24cf" (UID: "a70e1201-59bc-45ce-b002-848176bc24cf"). InnerVolumeSpecName "pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.546834 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:22:12 crc kubenswrapper[4764]: E1204 02:22:12.547114 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.620902 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txvlf\" (UniqueName: \"kubernetes.io/projected/a70e1201-59bc-45ce-b002-848176bc24cf-kube-api-access-txvlf\") on node \"crc\" DevicePath \"\"" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.620935 4764 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a70e1201-59bc-45ce-b002-848176bc24cf-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.620960 4764 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") on node \"crc\" " Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.646898 4764 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.647415 4764 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa") on node "crc" Dec 04 02:22:12 crc kubenswrapper[4764]: I1204 02:22:12.725175 4764 reconciler_common.go:293] "Volume detached for volume \"pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-aeaf306a-2d06-478c-9136-c61f8e8e1bfa\") on node \"crc\" DevicePath \"\"" Dec 04 02:22:13 crc kubenswrapper[4764]: I1204 02:22:13.409040 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 04 02:22:13 crc kubenswrapper[4764]: I1204 02:22:13.459775 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 04 02:22:13 crc kubenswrapper[4764]: I1204 02:22:13.474340 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 04 02:22:14 crc kubenswrapper[4764]: I1204 02:22:14.556672 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70e1201-59bc-45ce-b002-848176bc24cf" path="/var/lib/kubelet/pods/a70e1201-59bc-45ce-b002-848176bc24cf/volumes" Dec 04 02:22:23 crc kubenswrapper[4764]: I1204 02:22:23.099662 4764 scope.go:117] "RemoveContainer" containerID="a094c7a69f206ccbfa8fef0e881fa8c00abdc49bb06aebbdef92bbdb3d484e3d" Dec 04 02:22:24 crc kubenswrapper[4764]: I1204 02:22:24.559999 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:22:24 crc kubenswrapper[4764]: E1204 02:22:24.560907 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.507984 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4xfq9"] Dec 04 02:22:25 crc kubenswrapper[4764]: E1204 02:22:25.508787 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="extract-utilities" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.508811 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" 
containerName="extract-utilities" Dec 04 02:22:25 crc kubenswrapper[4764]: E1204 02:22:25.508846 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="extract-content" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.508856 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="extract-content" Dec 04 02:22:25 crc kubenswrapper[4764]: E1204 02:22:25.508887 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="registry-server" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.508896 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="registry-server" Dec 04 02:22:25 crc kubenswrapper[4764]: E1204 02:22:25.508908 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67613d0-4acc-4685-be57-a7a4f9d44a08" containerName="adoption" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.508915 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67613d0-4acc-4685-be57-a7a4f9d44a08" containerName="adoption" Dec 04 02:22:25 crc kubenswrapper[4764]: E1204 02:22:25.508932 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70e1201-59bc-45ce-b002-848176bc24cf" containerName="adoption" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.508940 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70e1201-59bc-45ce-b002-848176bc24cf" containerName="adoption" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.509176 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70e1201-59bc-45ce-b002-848176bc24cf" containerName="adoption" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.509193 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="a377522d-311a-44d4-8c36-74fef2eb8804" containerName="registry-server" Dec 04 
02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.509208 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67613d0-4acc-4685-be57-a7a4f9d44a08" containerName="adoption" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.512320 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.539390 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xfq9"] Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.608339 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pwrk\" (UniqueName: \"kubernetes.io/projected/e226d75a-7125-4132-816c-7983ef6b1532-kube-api-access-8pwrk\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.608394 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-catalog-content\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.608431 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-utilities\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.710294 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pwrk\" (UniqueName: 
\"kubernetes.io/projected/e226d75a-7125-4132-816c-7983ef6b1532-kube-api-access-8pwrk\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.710358 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-catalog-content\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.710394 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-utilities\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.711120 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-utilities\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.711144 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-catalog-content\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.735528 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pwrk\" (UniqueName: 
\"kubernetes.io/projected/e226d75a-7125-4132-816c-7983ef6b1532-kube-api-access-8pwrk\") pod \"community-operators-4xfq9\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:25 crc kubenswrapper[4764]: I1204 02:22:25.838525 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:26 crc kubenswrapper[4764]: I1204 02:22:26.358457 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4xfq9"] Dec 04 02:22:26 crc kubenswrapper[4764]: I1204 02:22:26.576978 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerStarted","Data":"52f82ea2c926ea65c8c70dc11039aef4ac80d076384d95a21137fdf61bff12fd"} Dec 04 02:22:27 crc kubenswrapper[4764]: I1204 02:22:27.578898 4764 generic.go:334] "Generic (PLEG): container finished" podID="e226d75a-7125-4132-816c-7983ef6b1532" containerID="a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54" exitCode=0 Dec 04 02:22:27 crc kubenswrapper[4764]: I1204 02:22:27.579176 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerDied","Data":"a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54"} Dec 04 02:22:27 crc kubenswrapper[4764]: I1204 02:22:27.581608 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 02:22:28 crc kubenswrapper[4764]: I1204 02:22:28.594301 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerStarted","Data":"71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e"} Dec 04 02:22:29 
crc kubenswrapper[4764]: I1204 02:22:29.607542 4764 generic.go:334] "Generic (PLEG): container finished" podID="e226d75a-7125-4132-816c-7983ef6b1532" containerID="71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e" exitCode=0 Dec 04 02:22:29 crc kubenswrapper[4764]: I1204 02:22:29.607645 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerDied","Data":"71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e"} Dec 04 02:22:30 crc kubenswrapper[4764]: I1204 02:22:30.619656 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerStarted","Data":"6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984"} Dec 04 02:22:30 crc kubenswrapper[4764]: I1204 02:22:30.648755 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4xfq9" podStartSLOduration=3.212117092 podStartE2EDuration="5.648708384s" podCreationTimestamp="2025-12-04 02:22:25 +0000 UTC" firstStartedPulling="2025-12-04 02:22:27.581198881 +0000 UTC m=+9683.342523312" lastFinishedPulling="2025-12-04 02:22:30.017790183 +0000 UTC m=+9685.779114604" observedRunningTime="2025-12-04 02:22:30.638155435 +0000 UTC m=+9686.399479836" watchObservedRunningTime="2025-12-04 02:22:30.648708384 +0000 UTC m=+9686.410032825" Dec 04 02:22:35 crc kubenswrapper[4764]: I1204 02:22:35.546102 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:22:35 crc kubenswrapper[4764]: E1204 02:22:35.546897 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:22:35 crc kubenswrapper[4764]: I1204 02:22:35.841301 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:35 crc kubenswrapper[4764]: I1204 02:22:35.841377 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:36 crc kubenswrapper[4764]: I1204 02:22:36.625408 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:36 crc kubenswrapper[4764]: I1204 02:22:36.747148 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:36 crc kubenswrapper[4764]: I1204 02:22:36.866099 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4xfq9"] Dec 04 02:22:38 crc kubenswrapper[4764]: I1204 02:22:38.715597 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4xfq9" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="registry-server" containerID="cri-o://6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984" gracePeriod=2 Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.267374 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.334627 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-catalog-content\") pod \"e226d75a-7125-4132-816c-7983ef6b1532\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.334712 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pwrk\" (UniqueName: \"kubernetes.io/projected/e226d75a-7125-4132-816c-7983ef6b1532-kube-api-access-8pwrk\") pod \"e226d75a-7125-4132-816c-7983ef6b1532\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.334816 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-utilities\") pod \"e226d75a-7125-4132-816c-7983ef6b1532\" (UID: \"e226d75a-7125-4132-816c-7983ef6b1532\") " Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.335984 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-utilities" (OuterVolumeSpecName: "utilities") pod "e226d75a-7125-4132-816c-7983ef6b1532" (UID: "e226d75a-7125-4132-816c-7983ef6b1532"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.342364 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e226d75a-7125-4132-816c-7983ef6b1532-kube-api-access-8pwrk" (OuterVolumeSpecName: "kube-api-access-8pwrk") pod "e226d75a-7125-4132-816c-7983ef6b1532" (UID: "e226d75a-7125-4132-816c-7983ef6b1532"). InnerVolumeSpecName "kube-api-access-8pwrk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.386226 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e226d75a-7125-4132-816c-7983ef6b1532" (UID: "e226d75a-7125-4132-816c-7983ef6b1532"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.438988 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.439029 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pwrk\" (UniqueName: \"kubernetes.io/projected/e226d75a-7125-4132-816c-7983ef6b1532-kube-api-access-8pwrk\") on node \"crc\" DevicePath \"\"" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.439044 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e226d75a-7125-4132-816c-7983ef6b1532-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.737957 4764 generic.go:334] "Generic (PLEG): container finished" podID="e226d75a-7125-4132-816c-7983ef6b1532" containerID="6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984" exitCode=0 Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.738023 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerDied","Data":"6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984"} Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.738062 4764 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4xfq9" event={"ID":"e226d75a-7125-4132-816c-7983ef6b1532","Type":"ContainerDied","Data":"52f82ea2c926ea65c8c70dc11039aef4ac80d076384d95a21137fdf61bff12fd"} Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.738104 4764 scope.go:117] "RemoveContainer" containerID="6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.738393 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4xfq9" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.787030 4764 scope.go:117] "RemoveContainer" containerID="71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.806795 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4xfq9"] Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.813071 4764 scope.go:117] "RemoveContainer" containerID="a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.823207 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4xfq9"] Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.897320 4764 scope.go:117] "RemoveContainer" containerID="6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984" Dec 04 02:22:39 crc kubenswrapper[4764]: E1204 02:22:39.898041 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984\": container with ID starting with 6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984 not found: ID does not exist" containerID="6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 
02:22:39.898127 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984"} err="failed to get container status \"6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984\": rpc error: code = NotFound desc = could not find container \"6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984\": container with ID starting with 6195ea37badf795e60ea0ef56c03534fa8e613f9da45ddc1661087c7e00f9984 not found: ID does not exist" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.898211 4764 scope.go:117] "RemoveContainer" containerID="71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e" Dec 04 02:22:39 crc kubenswrapper[4764]: E1204 02:22:39.900076 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e\": container with ID starting with 71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e not found: ID does not exist" containerID="71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.900102 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e"} err="failed to get container status \"71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e\": rpc error: code = NotFound desc = could not find container \"71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e\": container with ID starting with 71e4242a3e83c0bf625a789e52c91d46d2c8c59ede9e7aa06c1872c8d857a75e not found: ID does not exist" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.900117 4764 scope.go:117] "RemoveContainer" containerID="a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54" Dec 04 02:22:39 crc 
kubenswrapper[4764]: E1204 02:22:39.900448 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54\": container with ID starting with a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54 not found: ID does not exist" containerID="a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54" Dec 04 02:22:39 crc kubenswrapper[4764]: I1204 02:22:39.900468 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54"} err="failed to get container status \"a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54\": rpc error: code = NotFound desc = could not find container \"a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54\": container with ID starting with a8bde11b73fff3380d8c3f15113e5e7b81364d953ef6029794d47b974d429b54 not found: ID does not exist" Dec 04 02:22:40 crc kubenswrapper[4764]: I1204 02:22:40.559814 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e226d75a-7125-4132-816c-7983ef6b1532" path="/var/lib/kubelet/pods/e226d75a-7125-4132-816c-7983ef6b1532/volumes" Dec 04 02:22:46 crc kubenswrapper[4764]: I1204 02:22:46.547304 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:22:46 crc kubenswrapper[4764]: E1204 02:22:46.548148 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:22:59 crc 
kubenswrapper[4764]: I1204 02:22:59.547189 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:22:59 crc kubenswrapper[4764]: E1204 02:22:59.548301 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:23:13 crc kubenswrapper[4764]: I1204 02:23:13.546042 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:23:13 crc kubenswrapper[4764]: E1204 02:23:13.546778 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.173654 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q6sff/must-gather-9dkzh"] Dec 04 02:23:19 crc kubenswrapper[4764]: E1204 02:23:19.174915 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="extract-content" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.174934 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="extract-content" Dec 04 02:23:19 crc kubenswrapper[4764]: E1204 02:23:19.174968 4764 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="registry-server" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.174978 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="registry-server" Dec 04 02:23:19 crc kubenswrapper[4764]: E1204 02:23:19.174994 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="extract-utilities" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.175004 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="extract-utilities" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.175286 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="e226d75a-7125-4132-816c-7983ef6b1532" containerName="registry-server" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.176867 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.184530 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-q6sff"/"default-dockercfg-zchbg" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.184701 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-q6sff"/"kube-root-ca.crt" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.184823 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-q6sff"/"openshift-service-ca.crt" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.203148 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q6sff/must-gather-9dkzh"] Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.347915 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9fb45c90-358b-40d5-9554-c3f6c445ec83-must-gather-output\") pod \"must-gather-9dkzh\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.347970 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf6t8\" (UniqueName: \"kubernetes.io/projected/9fb45c90-358b-40d5-9554-c3f6c445ec83-kube-api-access-kf6t8\") pod \"must-gather-9dkzh\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.450505 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9fb45c90-358b-40d5-9554-c3f6c445ec83-must-gather-output\") pod \"must-gather-9dkzh\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " 
pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.450547 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf6t8\" (UniqueName: \"kubernetes.io/projected/9fb45c90-358b-40d5-9554-c3f6c445ec83-kube-api-access-kf6t8\") pod \"must-gather-9dkzh\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.451769 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9fb45c90-358b-40d5-9554-c3f6c445ec83-must-gather-output\") pod \"must-gather-9dkzh\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.490548 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf6t8\" (UniqueName: \"kubernetes.io/projected/9fb45c90-358b-40d5-9554-c3f6c445ec83-kube-api-access-kf6t8\") pod \"must-gather-9dkzh\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:19 crc kubenswrapper[4764]: I1204 02:23:19.505276 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:23:20 crc kubenswrapper[4764]: W1204 02:23:20.000399 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fb45c90_358b_40d5_9554_c3f6c445ec83.slice/crio-49012f3222099349d46a59f66e24714b7d97051c1c95bb85a6b5686a583903e4 WatchSource:0}: Error finding container 49012f3222099349d46a59f66e24714b7d97051c1c95bb85a6b5686a583903e4: Status 404 returned error can't find the container with id 49012f3222099349d46a59f66e24714b7d97051c1c95bb85a6b5686a583903e4 Dec 04 02:23:20 crc kubenswrapper[4764]: I1204 02:23:20.003786 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-q6sff/must-gather-9dkzh"] Dec 04 02:23:20 crc kubenswrapper[4764]: I1204 02:23:20.264852 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/must-gather-9dkzh" event={"ID":"9fb45c90-358b-40d5-9554-c3f6c445ec83","Type":"ContainerStarted","Data":"49012f3222099349d46a59f66e24714b7d97051c1c95bb85a6b5686a583903e4"} Dec 04 02:23:25 crc kubenswrapper[4764]: I1204 02:23:25.330994 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/must-gather-9dkzh" event={"ID":"9fb45c90-358b-40d5-9554-c3f6c445ec83","Type":"ContainerStarted","Data":"0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416"} Dec 04 02:23:25 crc kubenswrapper[4764]: I1204 02:23:25.331687 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/must-gather-9dkzh" event={"ID":"9fb45c90-358b-40d5-9554-c3f6c445ec83","Type":"ContainerStarted","Data":"f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51"} Dec 04 02:23:25 crc kubenswrapper[4764]: I1204 02:23:25.377541 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q6sff/must-gather-9dkzh" podStartSLOduration=2.148911329 
podStartE2EDuration="6.377524819s" podCreationTimestamp="2025-12-04 02:23:19 +0000 UTC" firstStartedPulling="2025-12-04 02:23:20.004026271 +0000 UTC m=+9735.765350692" lastFinishedPulling="2025-12-04 02:23:24.232639771 +0000 UTC m=+9739.993964182" observedRunningTime="2025-12-04 02:23:25.362011818 +0000 UTC m=+9741.123336239" watchObservedRunningTime="2025-12-04 02:23:25.377524819 +0000 UTC m=+9741.138849230" Dec 04 02:23:27 crc kubenswrapper[4764]: I1204 02:23:27.546755 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:23:27 crc kubenswrapper[4764]: E1204 02:23:27.547546 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:23:27 crc kubenswrapper[4764]: E1204 02:23:27.813749 4764 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.13:60368->38.102.83.13:39483: read tcp 38.102.83.13:60368->38.102.83.13:39483: read: connection reset by peer Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.236185 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q6sff/crc-debug-phgwj"] Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.238651 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.401083 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqkm4\" (UniqueName: \"kubernetes.io/projected/c489285c-aed7-4bbd-8e09-6f37dc3410f0-kube-api-access-qqkm4\") pod \"crc-debug-phgwj\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.401613 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c489285c-aed7-4bbd-8e09-6f37dc3410f0-host\") pod \"crc-debug-phgwj\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.503312 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c489285c-aed7-4bbd-8e09-6f37dc3410f0-host\") pod \"crc-debug-phgwj\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.503445 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqkm4\" (UniqueName: \"kubernetes.io/projected/c489285c-aed7-4bbd-8e09-6f37dc3410f0-kube-api-access-qqkm4\") pod \"crc-debug-phgwj\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.504086 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c489285c-aed7-4bbd-8e09-6f37dc3410f0-host\") pod \"crc-debug-phgwj\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc 
kubenswrapper[4764]: I1204 02:23:29.522045 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqkm4\" (UniqueName: \"kubernetes.io/projected/c489285c-aed7-4bbd-8e09-6f37dc3410f0-kube-api-access-qqkm4\") pod \"crc-debug-phgwj\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: I1204 02:23:29.559258 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:29 crc kubenswrapper[4764]: W1204 02:23:29.598912 4764 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc489285c_aed7_4bbd_8e09_6f37dc3410f0.slice/crio-a92d1525d223b491f1caa2af69cedb4741c4203ffbc3da36d1c6d0b14858ba92 WatchSource:0}: Error finding container a92d1525d223b491f1caa2af69cedb4741c4203ffbc3da36d1c6d0b14858ba92: Status 404 returned error can't find the container with id a92d1525d223b491f1caa2af69cedb4741c4203ffbc3da36d1c6d0b14858ba92 Dec 04 02:23:30 crc kubenswrapper[4764]: I1204 02:23:30.410328 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/crc-debug-phgwj" event={"ID":"c489285c-aed7-4bbd-8e09-6f37dc3410f0","Type":"ContainerStarted","Data":"a92d1525d223b491f1caa2af69cedb4741c4203ffbc3da36d1c6d0b14858ba92"} Dec 04 02:23:38 crc kubenswrapper[4764]: I1204 02:23:38.565804 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:23:38 crc kubenswrapper[4764]: E1204 02:23:38.566632 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:23:41 crc kubenswrapper[4764]: I1204 02:23:41.626613 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/crc-debug-phgwj" event={"ID":"c489285c-aed7-4bbd-8e09-6f37dc3410f0","Type":"ContainerStarted","Data":"da38f0e06538c04d15d6ecd2c145d673a6ef2543d5037bd0960e2cfe9ae596ec"} Dec 04 02:23:41 crc kubenswrapper[4764]: I1204 02:23:41.645086 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-q6sff/crc-debug-phgwj" podStartSLOduration=1.163832069 podStartE2EDuration="12.645067874s" podCreationTimestamp="2025-12-04 02:23:29 +0000 UTC" firstStartedPulling="2025-12-04 02:23:29.601565956 +0000 UTC m=+9745.362890377" lastFinishedPulling="2025-12-04 02:23:41.082801771 +0000 UTC m=+9756.844126182" observedRunningTime="2025-12-04 02:23:41.636619897 +0000 UTC m=+9757.397944298" watchObservedRunningTime="2025-12-04 02:23:41.645067874 +0000 UTC m=+9757.406392285" Dec 04 02:23:50 crc kubenswrapper[4764]: I1204 02:23:50.545907 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:23:50 crc kubenswrapper[4764]: E1204 02:23:50.546748 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:23:56 crc kubenswrapper[4764]: I1204 02:23:56.774132 4764 generic.go:334] "Generic (PLEG): container finished" podID="c489285c-aed7-4bbd-8e09-6f37dc3410f0" containerID="da38f0e06538c04d15d6ecd2c145d673a6ef2543d5037bd0960e2cfe9ae596ec" exitCode=0 Dec 
04 02:23:56 crc kubenswrapper[4764]: I1204 02:23:56.774615 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/crc-debug-phgwj" event={"ID":"c489285c-aed7-4bbd-8e09-6f37dc3410f0","Type":"ContainerDied","Data":"da38f0e06538c04d15d6ecd2c145d673a6ef2543d5037bd0960e2cfe9ae596ec"} Dec 04 02:23:57 crc kubenswrapper[4764]: I1204 02:23:57.903090 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:57 crc kubenswrapper[4764]: I1204 02:23:57.946727 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q6sff/crc-debug-phgwj"] Dec 04 02:23:57 crc kubenswrapper[4764]: I1204 02:23:57.955675 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q6sff/crc-debug-phgwj"] Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.035475 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c489285c-aed7-4bbd-8e09-6f37dc3410f0-host" (OuterVolumeSpecName: "host") pod "c489285c-aed7-4bbd-8e09-6f37dc3410f0" (UID: "c489285c-aed7-4bbd-8e09-6f37dc3410f0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.037427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c489285c-aed7-4bbd-8e09-6f37dc3410f0-host\") pod \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.037816 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqkm4\" (UniqueName: \"kubernetes.io/projected/c489285c-aed7-4bbd-8e09-6f37dc3410f0-kube-api-access-qqkm4\") pod \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\" (UID: \"c489285c-aed7-4bbd-8e09-6f37dc3410f0\") " Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.039041 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c489285c-aed7-4bbd-8e09-6f37dc3410f0-host\") on node \"crc\" DevicePath \"\"" Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.050947 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c489285c-aed7-4bbd-8e09-6f37dc3410f0-kube-api-access-qqkm4" (OuterVolumeSpecName: "kube-api-access-qqkm4") pod "c489285c-aed7-4bbd-8e09-6f37dc3410f0" (UID: "c489285c-aed7-4bbd-8e09-6f37dc3410f0"). InnerVolumeSpecName "kube-api-access-qqkm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.140836 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqkm4\" (UniqueName: \"kubernetes.io/projected/c489285c-aed7-4bbd-8e09-6f37dc3410f0-kube-api-access-qqkm4\") on node \"crc\" DevicePath \"\"" Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.558835 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c489285c-aed7-4bbd-8e09-6f37dc3410f0" path="/var/lib/kubelet/pods/c489285c-aed7-4bbd-8e09-6f37dc3410f0/volumes" Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.791555 4764 scope.go:117] "RemoveContainer" containerID="da38f0e06538c04d15d6ecd2c145d673a6ef2543d5037bd0960e2cfe9ae596ec" Dec 04 02:23:58 crc kubenswrapper[4764]: I1204 02:23:58.791658 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-phgwj" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.146259 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-q6sff/crc-debug-5wm2v"] Dec 04 02:23:59 crc kubenswrapper[4764]: E1204 02:23:59.146690 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c489285c-aed7-4bbd-8e09-6f37dc3410f0" containerName="container-00" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.146704 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c489285c-aed7-4bbd-8e09-6f37dc3410f0" containerName="container-00" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.147258 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c489285c-aed7-4bbd-8e09-6f37dc3410f0" containerName="container-00" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.148007 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.260893 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzj5s\" (UniqueName: \"kubernetes.io/projected/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-kube-api-access-qzj5s\") pod \"crc-debug-5wm2v\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.261204 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-host\") pod \"crc-debug-5wm2v\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.363349 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzj5s\" (UniqueName: \"kubernetes.io/projected/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-kube-api-access-qzj5s\") pod \"crc-debug-5wm2v\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.363452 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-host\") pod \"crc-debug-5wm2v\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.363617 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-host\") pod \"crc-debug-5wm2v\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc 
kubenswrapper[4764]: I1204 02:23:59.386550 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzj5s\" (UniqueName: \"kubernetes.io/projected/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-kube-api-access-qzj5s\") pod \"crc-debug-5wm2v\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.462790 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:23:59 crc kubenswrapper[4764]: I1204 02:23:59.801227 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/crc-debug-5wm2v" event={"ID":"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9","Type":"ContainerStarted","Data":"fb73c584d264059bcb9be27d87f3de6806e0d2e2c2d06bc0113cf1381edd2927"} Dec 04 02:24:00 crc kubenswrapper[4764]: I1204 02:24:00.813394 4764 generic.go:334] "Generic (PLEG): container finished" podID="4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" containerID="f162d12cac7b4966953e4a023868fa3c3d116ca6294104a8cf334bd0577fad39" exitCode=1 Dec 04 02:24:00 crc kubenswrapper[4764]: I1204 02:24:00.813471 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/crc-debug-5wm2v" event={"ID":"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9","Type":"ContainerDied","Data":"f162d12cac7b4966953e4a023868fa3c3d116ca6294104a8cf334bd0577fad39"} Dec 04 02:24:00 crc kubenswrapper[4764]: I1204 02:24:00.852645 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q6sff/crc-debug-5wm2v"] Dec 04 02:24:00 crc kubenswrapper[4764]: I1204 02:24:00.864879 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q6sff/crc-debug-5wm2v"] Dec 04 02:24:01 crc kubenswrapper[4764]: I1204 02:24:01.946508 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.048581 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzj5s\" (UniqueName: \"kubernetes.io/projected/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-kube-api-access-qzj5s\") pod \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.048652 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-host\") pod \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\" (UID: \"4f52ffd5-fdbd-47a7-b7b7-6184298a60d9\") " Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.048798 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-host" (OuterVolumeSpecName: "host") pod "4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" (UID: "4f52ffd5-fdbd-47a7-b7b7-6184298a60d9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.049098 4764 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-host\") on node \"crc\" DevicePath \"\"" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.062471 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-kube-api-access-qzj5s" (OuterVolumeSpecName: "kube-api-access-qzj5s") pod "4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" (UID: "4f52ffd5-fdbd-47a7-b7b7-6184298a60d9"). InnerVolumeSpecName "kube-api-access-qzj5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.150668 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzj5s\" (UniqueName: \"kubernetes.io/projected/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9-kube-api-access-qzj5s\") on node \"crc\" DevicePath \"\"" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.557867 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" path="/var/lib/kubelet/pods/4f52ffd5-fdbd-47a7-b7b7-6184298a60d9/volumes" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.839980 4764 scope.go:117] "RemoveContainer" containerID="f162d12cac7b4966953e4a023868fa3c3d116ca6294104a8cf334bd0577fad39" Dec 04 02:24:02 crc kubenswrapper[4764]: I1204 02:24:02.840064 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/crc-debug-5wm2v" Dec 04 02:24:04 crc kubenswrapper[4764]: I1204 02:24:04.557432 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:24:04 crc kubenswrapper[4764]: E1204 02:24:04.558132 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:24:17 crc kubenswrapper[4764]: I1204 02:24:17.546690 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:24:17 crc kubenswrapper[4764]: E1204 02:24:17.547584 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:24:31 crc kubenswrapper[4764]: I1204 02:24:31.546373 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:24:31 crc kubenswrapper[4764]: E1204 02:24:31.547438 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:24:42 crc kubenswrapper[4764]: I1204 02:24:42.552114 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:24:42 crc kubenswrapper[4764]: E1204 02:24:42.552990 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:24:56 crc kubenswrapper[4764]: I1204 02:24:56.546695 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:24:56 crc kubenswrapper[4764]: E1204 02:24:56.547757 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:25:11 crc kubenswrapper[4764]: I1204 02:25:11.547740 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:25:11 crc kubenswrapper[4764]: E1204 02:25:11.548453 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:25:26 crc kubenswrapper[4764]: I1204 02:25:26.546404 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:25:26 crc kubenswrapper[4764]: E1204 02:25:26.547597 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.349821 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pz4lx"] Dec 04 02:25:32 crc kubenswrapper[4764]: E1204 02:25:32.351415 4764 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" containerName="container-00" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.351445 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" containerName="container-00" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.351934 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f52ffd5-fdbd-47a7-b7b7-6184298a60d9" containerName="container-00" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.355582 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.365801 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz4lx"] Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.475232 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-catalog-content\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.475420 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snxg5\" (UniqueName: \"kubernetes.io/projected/65122f14-7baf-4eb0-ada8-ac234b6dda16-kube-api-access-snxg5\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.475447 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-utilities\") pod \"redhat-marketplace-pz4lx\" (UID: 
\"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.577786 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-utilities\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.577851 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-catalog-content\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.578009 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snxg5\" (UniqueName: \"kubernetes.io/projected/65122f14-7baf-4eb0-ada8-ac234b6dda16-kube-api-access-snxg5\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.578285 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-utilities\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.578596 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-catalog-content\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " 
pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.603570 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snxg5\" (UniqueName: \"kubernetes.io/projected/65122f14-7baf-4eb0-ada8-ac234b6dda16-kube-api-access-snxg5\") pod \"redhat-marketplace-pz4lx\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:32 crc kubenswrapper[4764]: I1204 02:25:32.682625 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:33 crc kubenswrapper[4764]: I1204 02:25:33.223855 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz4lx"] Dec 04 02:25:33 crc kubenswrapper[4764]: I1204 02:25:33.939658 4764 generic.go:334] "Generic (PLEG): container finished" podID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerID="0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840" exitCode=0 Dec 04 02:25:33 crc kubenswrapper[4764]: I1204 02:25:33.939892 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz4lx" event={"ID":"65122f14-7baf-4eb0-ada8-ac234b6dda16","Type":"ContainerDied","Data":"0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840"} Dec 04 02:25:33 crc kubenswrapper[4764]: I1204 02:25:33.940099 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz4lx" event={"ID":"65122f14-7baf-4eb0-ada8-ac234b6dda16","Type":"ContainerStarted","Data":"057b32477b7c153656600284cb3a1f247c25c31c097a9e07e5647b9be4e75c2c"} Dec 04 02:25:35 crc kubenswrapper[4764]: I1204 02:25:35.969299 4764 generic.go:334] "Generic (PLEG): container finished" podID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerID="0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b" exitCode=0 Dec 04 02:25:35 crc 
kubenswrapper[4764]: I1204 02:25:35.969388 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz4lx" event={"ID":"65122f14-7baf-4eb0-ada8-ac234b6dda16","Type":"ContainerDied","Data":"0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b"} Dec 04 02:25:36 crc kubenswrapper[4764]: I1204 02:25:36.984086 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz4lx" event={"ID":"65122f14-7baf-4eb0-ada8-ac234b6dda16","Type":"ContainerStarted","Data":"cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda"} Dec 04 02:25:37 crc kubenswrapper[4764]: I1204 02:25:37.003819 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pz4lx" podStartSLOduration=2.543045474 podStartE2EDuration="5.00380067s" podCreationTimestamp="2025-12-04 02:25:32 +0000 UTC" firstStartedPulling="2025-12-04 02:25:33.942758885 +0000 UTC m=+9869.704083326" lastFinishedPulling="2025-12-04 02:25:36.403514071 +0000 UTC m=+9872.164838522" observedRunningTime="2025-12-04 02:25:36.999671008 +0000 UTC m=+9872.760995419" watchObservedRunningTime="2025-12-04 02:25:37.00380067 +0000 UTC m=+9872.765125081" Dec 04 02:25:39 crc kubenswrapper[4764]: I1204 02:25:39.546846 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:25:39 crc kubenswrapper[4764]: E1204 02:25:39.547914 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:25:42 crc kubenswrapper[4764]: I1204 02:25:42.683499 4764 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:42 crc kubenswrapper[4764]: I1204 02:25:42.683836 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:42 crc kubenswrapper[4764]: I1204 02:25:42.738437 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:43 crc kubenswrapper[4764]: I1204 02:25:43.136338 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:43 crc kubenswrapper[4764]: I1204 02:25:43.199626 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz4lx"] Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.073914 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pz4lx" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="registry-server" containerID="cri-o://cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda" gracePeriod=2 Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.636812 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.684058 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-utilities\") pod \"65122f14-7baf-4eb0-ada8-ac234b6dda16\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.684137 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snxg5\" (UniqueName: \"kubernetes.io/projected/65122f14-7baf-4eb0-ada8-ac234b6dda16-kube-api-access-snxg5\") pod \"65122f14-7baf-4eb0-ada8-ac234b6dda16\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.684244 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-catalog-content\") pod \"65122f14-7baf-4eb0-ada8-ac234b6dda16\" (UID: \"65122f14-7baf-4eb0-ada8-ac234b6dda16\") " Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.691850 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-utilities" (OuterVolumeSpecName: "utilities") pod "65122f14-7baf-4eb0-ada8-ac234b6dda16" (UID: "65122f14-7baf-4eb0-ada8-ac234b6dda16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.701221 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65122f14-7baf-4eb0-ada8-ac234b6dda16-kube-api-access-snxg5" (OuterVolumeSpecName: "kube-api-access-snxg5") pod "65122f14-7baf-4eb0-ada8-ac234b6dda16" (UID: "65122f14-7baf-4eb0-ada8-ac234b6dda16"). InnerVolumeSpecName "kube-api-access-snxg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.733458 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65122f14-7baf-4eb0-ada8-ac234b6dda16" (UID: "65122f14-7baf-4eb0-ada8-ac234b6dda16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.787074 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.787306 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snxg5\" (UniqueName: \"kubernetes.io/projected/65122f14-7baf-4eb0-ada8-ac234b6dda16-kube-api-access-snxg5\") on node \"crc\" DevicePath \"\"" Dec 04 02:25:45 crc kubenswrapper[4764]: I1204 02:25:45.787394 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65122f14-7baf-4eb0-ada8-ac234b6dda16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.088974 4764 generic.go:334] "Generic (PLEG): container finished" podID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerID="cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda" exitCode=0 Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.089049 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz4lx" event={"ID":"65122f14-7baf-4eb0-ada8-ac234b6dda16","Type":"ContainerDied","Data":"cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda"} Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.089082 4764 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-pz4lx" event={"ID":"65122f14-7baf-4eb0-ada8-ac234b6dda16","Type":"ContainerDied","Data":"057b32477b7c153656600284cb3a1f247c25c31c097a9e07e5647b9be4e75c2c"} Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.089104 4764 scope.go:117] "RemoveContainer" containerID="cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.089159 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz4lx" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.142984 4764 scope.go:117] "RemoveContainer" containerID="0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.154917 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz4lx"] Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.177965 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz4lx"] Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.179833 4764 scope.go:117] "RemoveContainer" containerID="0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.219628 4764 scope.go:117] "RemoveContainer" containerID="cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda" Dec 04 02:25:46 crc kubenswrapper[4764]: E1204 02:25:46.221659 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda\": container with ID starting with cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda not found: ID does not exist" containerID="cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.221739 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda"} err="failed to get container status \"cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda\": rpc error: code = NotFound desc = could not find container \"cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda\": container with ID starting with cc14c6a511f7949436a36c3436b3fe37fccd2f776e7268e5734d9faecaf11fda not found: ID does not exist" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.221769 4764 scope.go:117] "RemoveContainer" containerID="0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b" Dec 04 02:25:46 crc kubenswrapper[4764]: E1204 02:25:46.222138 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b\": container with ID starting with 0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b not found: ID does not exist" containerID="0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.222184 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b"} err="failed to get container status \"0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b\": rpc error: code = NotFound desc = could not find container \"0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b\": container with ID starting with 0af9d67d5424aa65efdd61023002133339654f42ddc30dda43766938fee76f1b not found: ID does not exist" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.222219 4764 scope.go:117] "RemoveContainer" containerID="0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840" Dec 04 02:25:46 crc kubenswrapper[4764]: E1204 
02:25:46.222896 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840\": container with ID starting with 0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840 not found: ID does not exist" containerID="0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.222935 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840"} err="failed to get container status \"0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840\": rpc error: code = NotFound desc = could not find container \"0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840\": container with ID starting with 0fb54ea1826bf1abca4a2f16d355622dbdece4dee5df3e28894a8b9016ffe840 not found: ID does not exist" Dec 04 02:25:46 crc kubenswrapper[4764]: I1204 02:25:46.559322 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" path="/var/lib/kubelet/pods/65122f14-7baf-4eb0-ada8-ac234b6dda16/volumes" Dec 04 02:25:50 crc kubenswrapper[4764]: I1204 02:25:50.546455 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:25:50 crc kubenswrapper[4764]: E1204 02:25:50.547557 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:26:04 crc kubenswrapper[4764]: I1204 02:26:04.554417 
4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:26:04 crc kubenswrapper[4764]: E1204 02:26:04.555553 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:26:19 crc kubenswrapper[4764]: I1204 02:26:19.548531 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:26:19 crc kubenswrapper[4764]: E1204 02:26:19.549319 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:26:34 crc kubenswrapper[4764]: I1204 02:26:34.553889 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:26:35 crc kubenswrapper[4764]: I1204 02:26:35.731790 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"30e2e2ee18a6efe63ca5abfbf45ed7169376c0d67d98dc3282f66e3055425a5b"} Dec 04 02:27:14 crc kubenswrapper[4764]: I1204 02:27:14.566661 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_97e1e583-be68-470d-a9fd-b8bc27831cd4/init-config-reloader/0.log" Dec 04 02:27:14 crc kubenswrapper[4764]: I1204 02:27:14.850123 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_97e1e583-be68-470d-a9fd-b8bc27831cd4/alertmanager/0.log" Dec 04 02:27:14 crc kubenswrapper[4764]: I1204 02:27:14.908272 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_97e1e583-be68-470d-a9fd-b8bc27831cd4/init-config-reloader/0.log" Dec 04 02:27:14 crc kubenswrapper[4764]: I1204 02:27:14.939272 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_97e1e583-be68-470d-a9fd-b8bc27831cd4/config-reloader/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.085666 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_8af18f38-79b2-4deb-b9ff-11aa9eccf479/aodh-api/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.166832 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_8af18f38-79b2-4deb-b9ff-11aa9eccf479/aodh-evaluator/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.169616 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_8af18f38-79b2-4deb-b9ff-11aa9eccf479/aodh-listener/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.316655 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_8af18f38-79b2-4deb-b9ff-11aa9eccf479/aodh-notifier/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.373658 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57d69cf6fb-qrpxb_c79a42ff-9710-4e85-9572-b5ef52f182c9/barbican-api/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.409433 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-57d69cf6fb-qrpxb_c79a42ff-9710-4e85-9572-b5ef52f182c9/barbican-api-log/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.582600 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d646cdddb-jwkg4_6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2/barbican-keystone-listener/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.610096 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d646cdddb-jwkg4_6684ffcd-bdd3-4b3f-98fd-1f2adf292cf2/barbican-keystone-listener-log/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.782560 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-686884878f-7crbh_623ac352-8e8d-4baa-b526-01cdaa99302b/barbican-worker/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.843257 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-686884878f-7crbh_623ac352-8e8d-4baa-b526-01cdaa99302b/barbican-worker-log/0.log" Dec 04 02:27:15 crc kubenswrapper[4764]: I1204 02:27:15.950436 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-5h7lq_29cfe6b3-652e-48f3-974f-4b3cbe3815d4/bootstrap-openstack-openstack-cell1/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.032508 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_433a26a8-5ca9-408a-a45c-3ab1326328df/ceilometer-central-agent/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.167645 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_433a26a8-5ca9-408a-a45c-3ab1326328df/ceilometer-notification-agent/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.200767 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_433a26a8-5ca9-408a-a45c-3ab1326328df/proxy-httpd/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.336232 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_433a26a8-5ca9-408a-a45c-3ab1326328df/sg-core/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.371845 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-95f2v_4d57a9e3-2efc-449f-8d83-5b42ce3642c1/ceph-client-openstack-openstack-cell1/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.821414 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_718b4d98-d5da-4f9b-802c-b00afbdd9593/cinder-api/0.log" Dec 04 02:27:16 crc kubenswrapper[4764]: I1204 02:27:16.866844 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_718b4d98-d5da-4f9b-802c-b00afbdd9593/cinder-api-log/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.055127 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_04f3ee5a-109b-49df-9264-c6bb556af4bc/cinder-backup/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.075339 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_04f3ee5a-109b-49df-9264-c6bb556af4bc/probe/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.133031 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fb066f4d-c699-4548-8b39-85cc5a83211c/cinder-scheduler/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.310916 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fb066f4d-c699-4548-8b39-85cc5a83211c/probe/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.473479 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_81e8ae86-9710-408e-83e0-8c50d1c53ce1/probe/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.502527 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_81e8ae86-9710-408e-83e0-8c50d1c53ce1/cinder-volume/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.578608 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-lctlv_a9087c3e-7d6e-4590-a64c-3f9b5d3826fb/configure-network-openstack-openstack-cell1/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.696165 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-brgk5_a1e11990-8f5b-4b68-b569-71c8be08628d/configure-os-openstack-openstack-cell1/0.log" Dec 04 02:27:17 crc kubenswrapper[4764]: I1204 02:27:17.818708 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5df9d8c4b7-vnhgm_1d7015a3-19f2-4c8c-aba4-add826c42a61/init/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.009824 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-mrftr_1b9e5f50-30a4-4d90-bc88-4971bcc8740a/download-cache-openstack-openstack-cell1/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.025383 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5df9d8c4b7-vnhgm_1d7015a3-19f2-4c8c-aba4-add826c42a61/init/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.052370 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5df9d8c4b7-vnhgm_1d7015a3-19f2-4c8c-aba4-add826c42a61/dnsmasq-dns/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.203245 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_f231d621-5c72-4559-aad3-392c8bcba6e1/glance-httpd/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.229626 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f231d621-5c72-4559-aad3-392c8bcba6e1/glance-log/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.269593 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ad61f840-5767-444f-9fe8-12d36e3d1582/glance-httpd/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.323945 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_ad61f840-5767-444f-9fe8-12d36e3d1582/glance-log/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.590175 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-667b97b7d7-srfqv_07c60c6a-fb5d-4c32-9b00-fcbd51e2c9a9/heat-api/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.675503 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-f87fbcd57-sf6pr_fd51fc76-66f3-4cda-9906-631301e6e3c1/heat-engine/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.681733 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6f4478855c-4tplv_20544f5c-3377-485d-8170-d28325a9f913/heat-cfnapi/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.921250 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-75zbq_1c4de713-de0e-456d-9015-b2997b2ab3e1/install-certs-openstack-openstack-cell1/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.945015 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67554b9ccc-vdgrl_c9560a14-b532-45fb-943d-20a22e210b3f/horizon-log/0.log" Dec 04 02:27:18 crc kubenswrapper[4764]: I1204 02:27:18.955868 4764 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-67554b9ccc-vdgrl_c9560a14-b532-45fb-943d-20a22e210b3f/horizon/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.111124 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-nfbh6_9b5c6bdf-2afc-44dd-bd15-055fa374edc4/install-os-openstack-openstack-cell1/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.157535 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29413561-s55gq_22911e90-422c-404d-8b0a-1e1fdd0f731a/keystone-cron/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.400110 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-dd9cdf66d-j2gg2_c9f24c8f-e68f-4397-8c51-94d78d3cbe83/keystone-api/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.413667 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9e25921f-2ca3-4360-a344-fe779ac2ac52/kube-state-metrics/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.505422 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-9fc77_e419a2a6-f35a-4620-9014-eaefccaf150e/libvirt-openstack-openstack-cell1/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.631803 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ec1ff801-5a0d-4a01-9366-1b0355a19ca0/manila-api-log/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.676567 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_ec1ff801-5a0d-4a01-9366-1b0355a19ca0/manila-api/0.log" Dec 04 02:27:19 crc kubenswrapper[4764]: I1204 02:27:19.816091 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_dc9bdece-1ce0-4895-aaca-df458215eed1/manila-scheduler/0.log" Dec 04 02:27:20 crc kubenswrapper[4764]: 
I1204 02:27:20.143901 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_aec8203f-7269-4248-bf06-a696028aba5c/manila-share/0.log" Dec 04 02:27:20 crc kubenswrapper[4764]: I1204 02:27:20.161490 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_dc9bdece-1ce0-4895-aaca-df458215eed1/probe/0.log" Dec 04 02:27:20 crc kubenswrapper[4764]: I1204 02:27:20.164658 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_aec8203f-7269-4248-bf06-a696028aba5c/probe/0.log" Dec 04 02:27:20 crc kubenswrapper[4764]: I1204 02:27:20.482679 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54f47ffb59-682nm_978cd232-08ca-459a-91b9-a6c1a27ad58e/neutron-httpd/0.log" Dec 04 02:27:20 crc kubenswrapper[4764]: I1204 02:27:20.584870 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-54f47ffb59-682nm_978cd232-08ca-459a-91b9-a6c1a27ad58e/neutron-api/0.log" Dec 04 02:27:21 crc kubenswrapper[4764]: I1204 02:27:21.261248 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-nj9tw_054cf6c8-5222-4ade-a2f3-53aebec044b4/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 04 02:27:21 crc kubenswrapper[4764]: I1204 02:27:21.266952 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-qx7bp_e2b5c139-4e32-4f95-b7a9-9e33f4d5a1d3/neutron-metadata-openstack-openstack-cell1/0.log" Dec 04 02:27:21 crc kubenswrapper[4764]: I1204 02:27:21.517045 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-crnht_b8f75c01-301b-43f4-9d15-ad19080d1ba9/neutron-sriov-openstack-openstack-cell1/0.log" Dec 04 02:27:21 crc kubenswrapper[4764]: I1204 02:27:21.624435 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_9d5f8cf6-c3c7-4e35-8690-addc575935e9/nova-api-api/0.log" Dec 04 02:27:21 crc kubenswrapper[4764]: I1204 02:27:21.685537 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_9d5f8cf6-c3c7-4e35-8690-addc575935e9/nova-api-log/0.log" Dec 04 02:27:21 crc kubenswrapper[4764]: I1204 02:27:21.854290 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_35a6efa2-f727-4605-8a99-4b8fc183e2cb/nova-cell0-conductor-conductor/0.log" Dec 04 02:27:22 crc kubenswrapper[4764]: I1204 02:27:22.038836 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_524a45cb-f83d-4c0b-b05e-76399a5224eb/nova-cell1-conductor-conductor/0.log" Dec 04 02:27:22 crc kubenswrapper[4764]: I1204 02:27:22.167822 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d816b6df-2de6-4e61-9612-613ec427bd48/nova-cell1-novncproxy-novncproxy/0.log" Dec 04 02:27:22 crc kubenswrapper[4764]: I1204 02:27:22.485691 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell6d54m_69433a1d-a420-4643-9654-ceb18ac6556b/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 04 02:27:22 crc kubenswrapper[4764]: I1204 02:27:22.541674 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-rbc64_079d074f-7a44-4c65-98ca-68e216036454/nova-cell1-openstack-openstack-cell1/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.134620 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6ea57de4-69ee-4328-a251-b2cd08c64c7b/nova-metadata-metadata/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.231141 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_6ea57de4-69ee-4328-a251-b2cd08c64c7b/nova-metadata-log/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.292264 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_fa9be15b-1720-4d5f-8bab-13aa095347e9/nova-scheduler-scheduler/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.475991 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-58457b4d67-ncwfc_8703e4e1-060c-47a1-b9ce-99de3d89fe80/init/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.686266 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-58457b4d67-ncwfc_8703e4e1-060c-47a1-b9ce-99de3d89fe80/init/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.893195 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-58457b4d67-ncwfc_8703e4e1-060c-47a1-b9ce-99de3d89fe80/octavia-api-provider-agent/0.log" Dec 04 02:27:23 crc kubenswrapper[4764]: I1204 02:27:23.945471 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-58457b4d67-ncwfc_8703e4e1-060c-47a1-b9ce-99de3d89fe80/octavia-api/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.194501 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-q9k9p_fdd7c41d-eb6c-4fdc-8e37-a31282572e7d/init/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.388590 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-q9k9p_fdd7c41d-eb6c-4fdc-8e37-a31282572e7d/init/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.421534 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-hfsz5_15380bf0-c6f1-47bd-9fe1-9062df838464/init/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.617214 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-hfsz5_15380bf0-c6f1-47bd-9fe1-9062df838464/init/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.626114 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-q9k9p_fdd7c41d-eb6c-4fdc-8e37-a31282572e7d/octavia-healthmanager/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.636515 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-hfsz5_15380bf0-c6f1-47bd-9fe1-9062df838464/octavia-housekeeping/0.log" Dec 04 02:27:24 crc kubenswrapper[4764]: I1204 02:27:24.840994 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-5tszs_097194b1-f646-4a43-b58c-e2bfb03da583/init/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.111973 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-5tszs_097194b1-f646-4a43-b58c-e2bfb03da583/octavia-amphora-httpd/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.135164 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-56c9f55b99-5tszs_097194b1-f646-4a43-b58c-e2bfb03da583/init/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.192854 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wqtrg_becbf2e8-4899-4a6e-893c-84ffd4617c27/init/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.349855 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wqtrg_becbf2e8-4899-4a6e-893c-84ffd4617c27/octavia-rsyslog/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.350584 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-wqtrg_becbf2e8-4899-4a6e-893c-84ffd4617c27/init/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.427322 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-worker-cms8z_bc704469-633b-462f-8b15-974b0d822837/init/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.644893 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-cms8z_bc704469-633b-462f-8b15-974b0d822837/init/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.761749 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fecb71e-e1c0-490f-97f7-b58e4a3f7c01/mysql-bootstrap/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.866115 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-cms8z_bc704469-633b-462f-8b15-974b0d822837/octavia-worker/0.log" Dec 04 02:27:25 crc kubenswrapper[4764]: I1204 02:27:25.927016 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fecb71e-e1c0-490f-97f7-b58e4a3f7c01/mysql-bootstrap/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.008909 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_8fecb71e-e1c0-490f-97f7-b58e4a3f7c01/galera/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.095226 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8b1bbbf7-6f68-4a5a-897d-cfd533433b5a/mysql-bootstrap/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.300369 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_8b1bbbf7-6f68-4a5a-897d-cfd533433b5a/mysql-bootstrap/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.361489 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_dc6364bf-30dd-48f6-813a-bc1ece8188a4/openstackclient/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.397779 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_8b1bbbf7-6f68-4a5a-897d-cfd533433b5a/galera/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.616543 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-fn9rg_3444b973-3337-4806-a444-4749df6c6fe9/openstack-network-exporter/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.679921 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-k6vxt_462adcd9-211b-4d5f-9ebc-4289708c9ee9/ovn-controller/0.log" Dec 04 02:27:26 crc kubenswrapper[4764]: I1204 02:27:26.844410 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tcll6_af09cb3a-3fd3-47c6-ba05-a79b0c66efac/ovsdb-server-init/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.101243 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tcll6_af09cb3a-3fd3-47c6-ba05-a79b0c66efac/ovs-vswitchd/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.103479 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tcll6_af09cb3a-3fd3-47c6-ba05-a79b0c66efac/ovsdb-server-init/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.135647 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tcll6_af09cb3a-3fd3-47c6-ba05-a79b0c66efac/ovsdb-server/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.349533 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22754f83-6ba0-48f6-82f7-5a28b5c5498c/openstack-network-exporter/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.357844 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_22754f83-6ba0-48f6-82f7-5a28b5c5498c/ovn-northd/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.569554 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_f95fbeb7-559b-43f0-9c44-6462f6db2e42/openstack-network-exporter/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.668591 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f95fbeb7-559b-43f0-9c44-6462f6db2e42/ovsdbserver-nb/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.668818 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-g5wgp_1b4ad484-2065-4ff3-9c95-30391bbec966/ovn-openstack-openstack-cell1/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.886863 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7f86b426-5476-4395-a2f4-7b9ba3dead52/openstack-network-exporter/0.log" Dec 04 02:27:27 crc kubenswrapper[4764]: I1204 02:27:27.907783 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_7f86b426-5476-4395-a2f4-7b9ba3dead52/ovsdbserver-nb/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.112585 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_211073e0-a5c0-49b8-a4e2-524448c0f91c/openstack-network-exporter/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.180583 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_211073e0-a5c0-49b8-a4e2-524448c0f91c/ovsdbserver-nb/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.375798 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1a9830a9-8b4b-443e-bca0-4b70d281dfb9/openstack-network-exporter/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.581859 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1a9830a9-8b4b-443e-bca0-4b70d281dfb9/ovsdbserver-sb/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.765513 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-1_0189e999-3d6d-43b6-9c0c-2afa24e5b6c0/openstack-network-exporter/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.839418 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_0189e999-3d6d-43b6-9c0c-2afa24e5b6c0/ovsdbserver-sb/0.log" Dec 04 02:27:28 crc kubenswrapper[4764]: I1204 02:27:28.929446 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b37142e0-8801-4951-ab52-0a6a553b23ff/openstack-network-exporter/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.069588 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_b37142e0-8801-4951-ab52-0a6a553b23ff/ovsdbserver-sb/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.169624 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54c9bb54b6-85qmh_7847c079-88f3-4ae0-a4b9-666c82f8b8a6/placement-api/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.263388 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54c9bb54b6-85qmh_7847c079-88f3-4ae0-a4b9-666c82f8b8a6/placement-log/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.365461 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cvbfwg_3da912a0-fc02-4542-928b-e77f1fc9367b/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.480581 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_91dceff4-dc61-4803-98fb-da530493e50c/init-config-reloader/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.650420 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_91dceff4-dc61-4803-98fb-da530493e50c/thanos-sidecar/0.log" Dec 04 02:27:29 crc 
kubenswrapper[4764]: I1204 02:27:29.698298 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_91dceff4-dc61-4803-98fb-da530493e50c/config-reloader/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.701124 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_91dceff4-dc61-4803-98fb-da530493e50c/prometheus/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.704473 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_91dceff4-dc61-4803-98fb-da530493e50c/init-config-reloader/0.log" Dec 04 02:27:29 crc kubenswrapper[4764]: I1204 02:27:29.906095 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62f1b02a-92d6-496e-b922-2de70fae0f9a/setup-container/0.log" Dec 04 02:27:30 crc kubenswrapper[4764]: I1204 02:27:30.190573 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_96b866f4-5df7-4e21-96a9-f50903969fde/memcached/0.log" Dec 04 02:27:30 crc kubenswrapper[4764]: I1204 02:27:30.771157 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62f1b02a-92d6-496e-b922-2de70fae0f9a/rabbitmq/0.log" Dec 04 02:27:30 crc kubenswrapper[4764]: I1204 02:27:30.778341 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_62f1b02a-92d6-496e-b922-2de70fae0f9a/setup-container/0.log" Dec 04 02:27:30 crc kubenswrapper[4764]: I1204 02:27:30.887813 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5633760a-81af-4a8e-a1d3-9eff9c06ca40/setup-container/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.057482 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5633760a-81af-4a8e-a1d3-9eff9c06ca40/setup-container/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 
02:27:31.117970 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_5633760a-81af-4a8e-a1d3-9eff9c06ca40/rabbitmq/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.214591 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-l2n82_fb653e7a-4650-4b7c-a875-6773b4db51c6/reboot-os-openstack-openstack-cell1/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.314902 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-5f8s8_17743066-ad51-4fa0-ad0b-27b20e412a5a/run-os-openstack-openstack-cell1/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.454428 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-585vt_c6d34e1b-7b2f-4302-b01c-a8c2e1fd2020/ssh-known-hosts-openstack/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.601872 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-t7qqm_3e94945c-c71a-4fd7-95d1-8609b7bc068e/telemetry-openstack-openstack-cell1/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.672038 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-qjffw_4336186f-bad0-463f-8133-1f6d260ab27f/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 04 02:27:31 crc kubenswrapper[4764]: I1204 02:27:31.816517 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-m4rkn_2bbe80b3-f8b0-4197-9836-34847231fe93/validate-network-openstack-openstack-cell1/0.log" Dec 04 02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.305816 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/util/0.log" Dec 04 
02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.518530 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/util/0.log" Dec 04 02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.521407 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/pull/0.log" Dec 04 02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.577199 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/pull/0.log" Dec 04 02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.744003 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/pull/0.log" Dec 04 02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.761430 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/util/0.log" Dec 04 02:27:57 crc kubenswrapper[4764]: I1204 02:27:57.801350 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d2206864srrnp_db4fd273-0b92-483d-b8c6-5f558f260a3e/extract/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.003983 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-7kt8t_94c55b91-cbc9-47c5-8abc-5140aeebf8d0/kube-rbac-proxy/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.032374 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-bvqgr_ad7f7e9e-482b-415c-bf8d-02c9efbe387d/kube-rbac-proxy/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.047446 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-7kt8t_94c55b91-cbc9-47c5-8abc-5140aeebf8d0/manager/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.265093 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-bvqgr_ad7f7e9e-482b-415c-bf8d-02c9efbe387d/manager/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.299188 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-q2lz5_4882c949-7a46-408b-a5ee-fc0fdcc6b291/manager/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.326054 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-q2lz5_4882c949-7a46-408b-a5ee-fc0fdcc6b291/kube-rbac-proxy/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.563179 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-qxhzq_06f61b54-9d32-467a-be0b-07f8fdf867aa/kube-rbac-proxy/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.693997 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-qxhzq_06f61b54-9d32-467a-be0b-07f8fdf867aa/manager/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.768117 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-k66k8_f2ff43e5-0f12-4008-a286-e6872cf78923/kube-rbac-proxy/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.865538 
4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-k66k8_f2ff43e5-0f12-4008-a286-e6872cf78923/manager/0.log" Dec 04 02:27:58 crc kubenswrapper[4764]: I1204 02:27:58.947247 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-nt8s2_37b7d1f8-a668-44e9-af8d-0be7555bf2f6/kube-rbac-proxy/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.005509 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-nt8s2_37b7d1f8-a668-44e9-af8d-0be7555bf2f6/manager/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.115816 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-nt8f5_4e494874-aa22-4cbb-aef2-a20b3ad2eea3/kube-rbac-proxy/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.328350 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-dt6f8_3178ac5b-9384-4653-bbc2-713a718eac88/kube-rbac-proxy/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.391656 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-dt6f8_3178ac5b-9384-4653-bbc2-713a718eac88/manager/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.428071 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-nt8f5_4e494874-aa22-4cbb-aef2-a20b3ad2eea3/manager/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.533825 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gp7h7_ebef9632-34f5-48ce-9a64-c76cf619498e/kube-rbac-proxy/0.log" Dec 04 02:27:59 crc 
kubenswrapper[4764]: I1204 02:27:59.765170 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-qwfd7_c4162b99-a9c7-471c-a408-058ffb74fe69/kube-rbac-proxy/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.800880 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-qwfd7_c4162b99-a9c7-471c-a408-058ffb74fe69/manager/0.log" Dec 04 02:27:59 crc kubenswrapper[4764]: I1204 02:27:59.823836 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-gp7h7_ebef9632-34f5-48ce-9a64-c76cf619498e/manager/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.012274 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-xrd58_984c6845-3698-46c3-9d88-416635322b98/kube-rbac-proxy/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.065498 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-xrd58_984c6845-3698-46c3-9d88-416635322b98/manager/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.130386 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vglgf_64d40f74-d89f-4e84-bb91-ff8cdcfdc747/kube-rbac-proxy/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.310392 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vglgf_64d40f74-d89f-4e84-bb91-ff8cdcfdc747/manager/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.319574 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-975dt_2ec14efa-ac80-45f6-bcd2-20b404087776/kube-rbac-proxy/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.511113 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-975dt_2ec14efa-ac80-45f6-bcd2-20b404087776/manager/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.570846 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-zj7br_e69ad255-4b3d-4c49-ad8a-59850f846c00/kube-rbac-proxy/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.642058 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-zj7br_e69ad255-4b3d-4c49-ad8a-59850f846c00/manager/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.756768 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55d86b6686p2kgq_f18b5092-5d70-482a-af1f-be661a68701e/kube-rbac-proxy/0.log" Dec 04 02:28:00 crc kubenswrapper[4764]: I1204 02:28:00.761523 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55d86b6686p2kgq_f18b5092-5d70-482a-af1f-be661a68701e/manager/0.log" Dec 04 02:28:01 crc kubenswrapper[4764]: I1204 02:28:01.933247 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7dd5c7bb7c-b5s5w_a69b6c88-819a-444b-8eff-4b629d4d4c87/operator/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.006567 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ph7dn_c632a187-d000-4581-bdbb-9f0f47e448cc/registry-server/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.040871 
4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2hmdx_2ccf4351-3ae2-432d-ae11-1a07dab689ae/kube-rbac-proxy/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.243284 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-k5sms_ee4f58e7-bb44-483d-9ab4-c1e447c5e68c/kube-rbac-proxy/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.288253 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2hmdx_2ccf4351-3ae2-432d-ae11-1a07dab689ae/manager/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.332701 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-k5sms_ee4f58e7-bb44-483d-9ab4-c1e447c5e68c/manager/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.494429 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-p7ptg_d4dea3da-3ffa-4da1-9a93-5f3233112a23/operator/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.644258 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-vzgmn_0f640363-0b23-4625-bf4d-2829b924640d/kube-rbac-proxy/0.log" Dec 04 02:28:02 crc kubenswrapper[4764]: I1204 02:28:02.693709 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-vzgmn_0f640363-0b23-4625-bf4d-2829b924640d/manager/0.log" Dec 04 02:28:03 crc kubenswrapper[4764]: I1204 02:28:03.280675 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-fj6g2_d703896f-745b-4359-8d2a-2c4b7cf5d062/kube-rbac-proxy/0.log" Dec 04 02:28:03 crc 
kubenswrapper[4764]: I1204 02:28:03.281994 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-gd8dc_247a518a-17e9-482b-bab7-832b31fa99e1/kube-rbac-proxy/0.log" Dec 04 02:28:03 crc kubenswrapper[4764]: I1204 02:28:03.485436 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-s2rm5_ae29fd3c-f586-427c-a482-fdc2f609aa25/kube-rbac-proxy/0.log" Dec 04 02:28:03 crc kubenswrapper[4764]: I1204 02:28:03.488680 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-gd8dc_247a518a-17e9-482b-bab7-832b31fa99e1/manager/0.log" Dec 04 02:28:03 crc kubenswrapper[4764]: I1204 02:28:03.489103 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-fj6g2_d703896f-745b-4359-8d2a-2c4b7cf5d062/manager/0.log" Dec 04 02:28:03 crc kubenswrapper[4764]: I1204 02:28:03.684343 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-s2rm5_ae29fd3c-f586-427c-a482-fdc2f609aa25/manager/0.log" Dec 04 02:28:04 crc kubenswrapper[4764]: I1204 02:28:04.054255 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-9f56fc979-6ssgm_7c88c833-c710-44c7-9bfb-a684a7f39c39/manager/0.log" Dec 04 02:28:24 crc kubenswrapper[4764]: I1204 02:28:24.945869 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fnct7_6669d173-9f6e-49d9-8159-5c0406eedac9/control-plane-machine-set-operator/0.log" Dec 04 02:28:25 crc kubenswrapper[4764]: I1204 02:28:25.125368 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vlqkp_d4a5d117-9aa6-4b48-8862-2be01934454a/kube-rbac-proxy/0.log" Dec 04 02:28:25 crc kubenswrapper[4764]: I1204 02:28:25.190565 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-vlqkp_d4a5d117-9aa6-4b48-8862-2be01934454a/machine-api-operator/0.log" Dec 04 02:28:39 crc kubenswrapper[4764]: I1204 02:28:39.349558 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-p8rfr_ba1a186a-006d-4fa1-a90c-48d4220661ba/cert-manager-controller/0.log" Dec 04 02:28:39 crc kubenswrapper[4764]: I1204 02:28:39.497543 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-vjvpk_5aa91780-e991-4b90-abba-bcdeba0898d4/cert-manager-cainjector/0.log" Dec 04 02:28:39 crc kubenswrapper[4764]: I1204 02:28:39.568499 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-g74lr_5363aea9-b6c1-4b4b-9446-30b2bba729cc/cert-manager-webhook/0.log" Dec 04 02:28:50 crc kubenswrapper[4764]: I1204 02:28:50.868579 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:28:50 crc kubenswrapper[4764]: I1204 02:28:50.869465 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:28:53 crc kubenswrapper[4764]: I1204 02:28:53.017924 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-phnlh_760538af-8a9b-4c70-9743-0ebd1799e6f4/nmstate-console-plugin/0.log" Dec 04 02:28:53 crc kubenswrapper[4764]: I1204 02:28:53.305697 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-rdwmt_ef6a3342-1957-4648-9026-0c14fd0589d4/nmstate-handler/0.log" Dec 04 02:28:53 crc kubenswrapper[4764]: I1204 02:28:53.453889 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-z7lsn_e5bd5023-35b9-4646-b974-063e5f048d1b/kube-rbac-proxy/0.log" Dec 04 02:28:53 crc kubenswrapper[4764]: I1204 02:28:53.522757 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-z7lsn_e5bd5023-35b9-4646-b974-063e5f048d1b/nmstate-metrics/0.log" Dec 04 02:28:53 crc kubenswrapper[4764]: I1204 02:28:53.574820 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-wxdrg_a3ba2f28-5fef-4cce-bd05-34da304861e9/nmstate-operator/0.log" Dec 04 02:28:53 crc kubenswrapper[4764]: I1204 02:28:53.680630 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-4nm7p_7fb73b2d-5404-4abe-8b57-92a14a48b7ec/nmstate-webhook/0.log" Dec 04 02:29:12 crc kubenswrapper[4764]: I1204 02:29:12.439011 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-nqrkc_6e46fa91-e07b-4b3a-97fd-e1aa7608eb87/kube-rbac-proxy/0.log" Dec 04 02:29:12 crc kubenswrapper[4764]: I1204 02:29:12.660120 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-frr-files/0.log" Dec 04 02:29:12 crc kubenswrapper[4764]: I1204 02:29:12.842547 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-nqrkc_6e46fa91-e07b-4b3a-97fd-e1aa7608eb87/controller/0.log" Dec 04 
02:29:12 crc kubenswrapper[4764]: I1204 02:29:12.920958 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-frr-files/0.log" Dec 04 02:29:12 crc kubenswrapper[4764]: I1204 02:29:12.965742 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-metrics/0.log" Dec 04 02:29:12 crc kubenswrapper[4764]: I1204 02:29:12.977775 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-reloader/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.039761 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-reloader/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.260784 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-metrics/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.278353 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-reloader/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.281112 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-metrics/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.290077 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-frr-files/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.476493 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-reloader/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.478385 4764 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-frr-files/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.491381 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/cp-metrics/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.498292 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/controller/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.690753 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/kube-rbac-proxy-frr/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.699869 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/frr-metrics/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.752209 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/kube-rbac-proxy/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.860997 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/reloader/0.log" Dec 04 02:29:13 crc kubenswrapper[4764]: I1204 02:29:13.997653 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-g975j_049d1003-dccf-474e-8d17-359aa1ae6d95/frr-k8s-webhook-server/0.log" Dec 04 02:29:14 crc kubenswrapper[4764]: I1204 02:29:14.274226 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5bb48b4db7-29gpr_f588cdd9-3f46-470b-9c63-9de8eab25f1a/manager/0.log" Dec 04 02:29:14 crc kubenswrapper[4764]: I1204 02:29:14.365607 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7769dfdc9d-slqg9_cbfc441b-442f-4232-85e4-51ab089ea1d9/webhook-server/0.log" Dec 04 02:29:14 crc kubenswrapper[4764]: I1204 02:29:14.472451 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t6gtd_086eaaf6-7fad-4866-b78a-4123a4f6e9a1/kube-rbac-proxy/0.log" Dec 04 02:29:15 crc kubenswrapper[4764]: I1204 02:29:15.422677 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-t6gtd_086eaaf6-7fad-4866-b78a-4123a4f6e9a1/speaker/0.log" Dec 04 02:29:16 crc kubenswrapper[4764]: I1204 02:29:16.795695 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r56k9_56658724-6546-4715-8fa6-6997065dad38/frr/0.log" Dec 04 02:29:20 crc kubenswrapper[4764]: I1204 02:29:20.869158 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:29:20 crc kubenswrapper[4764]: I1204 02:29:20.869891 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.112846 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/util/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.341362 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/util/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.363689 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/pull/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.444853 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/pull/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.556191 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/util/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.598829 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/extract/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.651839 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931akfrq8_b1930095-9d98-42ce-bc7e-46ac75742d43/pull/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.736904 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/util/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.935346 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/util/0.log" Dec 04 
02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.936858 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/pull/0.log" Dec 04 02:29:30 crc kubenswrapper[4764]: I1204 02:29:30.968293 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/pull/0.log" Dec 04 02:29:31 crc kubenswrapper[4764]: I1204 02:29:31.162983 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/pull/0.log" Dec 04 02:29:31 crc kubenswrapper[4764]: I1204 02:29:31.190644 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/extract/0.log" Dec 04 02:29:31 crc kubenswrapper[4764]: I1204 02:29:31.197485 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fkjwdx_d789b8c1-b96b-4809-8e37-1ca6b9b39adc/util/0.log" Dec 04 02:29:31 crc kubenswrapper[4764]: I1204 02:29:31.897921 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/util/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.105437 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/pull/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.127601 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/util/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.143039 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/pull/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.335564 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/util/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.381436 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/extract/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.405710 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92108xvtw_4dd7f558-826d-4e33-bf17-9021f28ce1e6/pull/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.580535 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/util/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.740356 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/pull/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.786621 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/pull/0.log" Dec 04 
02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.794314 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/util/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.977308 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/extract/0.log" Dec 04 02:29:32 crc kubenswrapper[4764]: I1204 02:29:32.980488 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/util/0.log" Dec 04 02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.140459 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rbpqp_ddf82a6a-996e-42ed-b8e3-3ec8f6380323/pull/0.log" Dec 04 02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.505274 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/extract-utilities/0.log" Dec 04 02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.691161 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/extract-content/0.log" Dec 04 02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.695899 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/extract-utilities/0.log" Dec 04 02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.745064 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/extract-content/0.log" Dec 04 
02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.901435 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/extract-utilities/0.log" Dec 04 02:29:33 crc kubenswrapper[4764]: I1204 02:29:33.965244 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/extract-utilities/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.045045 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/extract-content/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.200637 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/extract-content/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.244403 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/extract-content/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.266118 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/extract-utilities/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.441401 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/extract-utilities/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.468703 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/extract-content/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.630998 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zl7q7_b1e53002-80dd-456b-8da8-e7dc634450d8/marketplace-operator/0.log" Dec 04 02:29:34 crc kubenswrapper[4764]: I1204 02:29:34.786936 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/extract-utilities/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.216247 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/extract-content/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.235948 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-z65ht_8708be4a-2506-4174-b4a9-3e9627a6ce3c/registry-server/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.306377 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/extract-utilities/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.318275 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/extract-content/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.440510 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/extract-utilities/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.488531 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/extract-content/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.651078 4764 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/extract-utilities/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.682176 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wq2db_26c5f969-84ca-4652-8914-d01b7bbac800/registry-server/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.869389 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-k95g9_c7065eff-afd9-444c-8830-c58d4c4702c9/registry-server/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.959772 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/extract-content/0.log" Dec 04 02:29:35 crc kubenswrapper[4764]: I1204 02:29:35.966199 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/extract-utilities/0.log" Dec 04 02:29:36 crc kubenswrapper[4764]: I1204 02:29:36.000279 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/extract-content/0.log" Dec 04 02:29:36 crc kubenswrapper[4764]: I1204 02:29:36.153917 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/extract-content/0.log" Dec 04 02:29:36 crc kubenswrapper[4764]: I1204 02:29:36.161627 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/extract-utilities/0.log" Dec 04 02:29:37 crc kubenswrapper[4764]: I1204 02:29:37.213177 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-75x4f_72e5bec5-5dac-4d3b-999f-864ccb0a7595/registry-server/0.log" Dec 04 
02:29:49 crc kubenswrapper[4764]: I1204 02:29:49.536039 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-lfv25_cb33cd08-98e5-454a-85df-bf1f1c711c48/prometheus-operator/0.log" Dec 04 02:29:49 crc kubenswrapper[4764]: I1204 02:29:49.652045 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bbc8ccfb5-chx8s_1a8ae582-b26e-4da5-9474-d2e049f6d86e/prometheus-operator-admission-webhook/0.log" Dec 04 02:29:49 crc kubenswrapper[4764]: I1204 02:29:49.733235 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-bbc8ccfb5-vt87k_8490b392-1234-48b9-8522-d7e07fca695d/prometheus-operator-admission-webhook/0.log" Dec 04 02:29:49 crc kubenswrapper[4764]: I1204 02:29:49.858964 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-cztpp_d44a2888-4635-4cac-a1d4-f68fd374072f/operator/0.log" Dec 04 02:29:49 crc kubenswrapper[4764]: I1204 02:29:49.939267 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-5d9jn_ed402497-6092-40f1-912c-6c7d59ef70f2/perses-operator/0.log" Dec 04 02:29:50 crc kubenswrapper[4764]: I1204 02:29:50.869278 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:29:50 crc kubenswrapper[4764]: I1204 02:29:50.869564 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 04 02:29:50 crc kubenswrapper[4764]: I1204 02:29:50.869605 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 02:29:50 crc kubenswrapper[4764]: I1204 02:29:50.870059 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30e2e2ee18a6efe63ca5abfbf45ed7169376c0d67d98dc3282f66e3055425a5b"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 02:29:50 crc kubenswrapper[4764]: I1204 02:29:50.870103 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://30e2e2ee18a6efe63ca5abfbf45ed7169376c0d67d98dc3282f66e3055425a5b" gracePeriod=600 Dec 04 02:29:51 crc kubenswrapper[4764]: I1204 02:29:51.909432 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="30e2e2ee18a6efe63ca5abfbf45ed7169376c0d67d98dc3282f66e3055425a5b" exitCode=0 Dec 04 02:29:51 crc kubenswrapper[4764]: I1204 02:29:51.909495 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"30e2e2ee18a6efe63ca5abfbf45ed7169376c0d67d98dc3282f66e3055425a5b"} Dec 04 02:29:51 crc kubenswrapper[4764]: I1204 02:29:51.910014 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" 
event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerStarted","Data":"9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19"} Dec 04 02:29:51 crc kubenswrapper[4764]: I1204 02:29:51.910034 4764 scope.go:117] "RemoveContainer" containerID="ab0ec80a5c15185c557e2d9c6fe454aaa5c3c3fa30579ed847138e1cf4825000" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.954616 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8nq9n"] Dec 04 02:29:56 crc kubenswrapper[4764]: E1204 02:29:56.955447 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="registry-server" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.955462 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="registry-server" Dec 04 02:29:56 crc kubenswrapper[4764]: E1204 02:29:56.955507 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="extract-utilities" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.955514 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="extract-utilities" Dec 04 02:29:56 crc kubenswrapper[4764]: E1204 02:29:56.955528 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="extract-content" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.955534 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="extract-content" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.955919 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="65122f14-7baf-4eb0-ada8-ac234b6dda16" containerName="registry-server" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.957446 4764 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:56 crc kubenswrapper[4764]: I1204 02:29:56.974871 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nq9n"] Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.023227 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-catalog-content\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.023284 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9gsv\" (UniqueName: \"kubernetes.io/projected/96e45107-642d-4607-abec-a7684d84f0f9-kube-api-access-r9gsv\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.023398 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-utilities\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.125177 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-catalog-content\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.125477 4764 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9gsv\" (UniqueName: \"kubernetes.io/projected/96e45107-642d-4607-abec-a7684d84f0f9-kube-api-access-r9gsv\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.125643 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-catalog-content\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.125816 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-utilities\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.126159 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-utilities\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.147362 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9gsv\" (UniqueName: \"kubernetes.io/projected/96e45107-642d-4607-abec-a7684d84f0f9-kube-api-access-r9gsv\") pod \"certified-operators-8nq9n\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.278405 4764 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.921138 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8nq9n"] Dec 04 02:29:57 crc kubenswrapper[4764]: I1204 02:29:57.984160 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerStarted","Data":"7d06da9d475e2e99e29c6d3db8dbb49ffdeafa179e0cd33db95d82e9327ed41d"} Dec 04 02:29:58 crc kubenswrapper[4764]: I1204 02:29:58.993495 4764 generic.go:334] "Generic (PLEG): container finished" podID="96e45107-642d-4607-abec-a7684d84f0f9" containerID="24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf" exitCode=0 Dec 04 02:29:58 crc kubenswrapper[4764]: I1204 02:29:58.993535 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerDied","Data":"24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf"} Dec 04 02:29:58 crc kubenswrapper[4764]: I1204 02:29:58.996867 4764 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.004000 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerStarted","Data":"a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4"} Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.177914 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57"] Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.180034 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.183872 4764 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.184150 4764 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.205231 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57"] Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.293831 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-config-volume\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.293889 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-secret-volume\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.293926 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzvkn\" (UniqueName: \"kubernetes.io/projected/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-kube-api-access-zzvkn\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.395975 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-config-volume\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.396017 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-secret-volume\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.396039 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzvkn\" (UniqueName: \"kubernetes.io/projected/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-kube-api-access-zzvkn\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.397423 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-config-volume\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.405336 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-secret-volume\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.417784 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzvkn\" (UniqueName: \"kubernetes.io/projected/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-kube-api-access-zzvkn\") pod \"collect-profiles-29413590-h5d57\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.507624 4764 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:00 crc kubenswrapper[4764]: I1204 02:30:00.861486 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57"] Dec 04 02:30:01 crc kubenswrapper[4764]: I1204 02:30:01.028683 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" event={"ID":"ae5218dc-4f2b-4630-a7f9-bd640183b1a4","Type":"ContainerStarted","Data":"a1adb9e95e0e94c6e13f972deb1a37440c1b20c020b1f33fc77a97ae5ce0b257"} Dec 04 02:30:01 crc kubenswrapper[4764]: I1204 02:30:01.043971 4764 generic.go:334] "Generic (PLEG): container finished" podID="96e45107-642d-4607-abec-a7684d84f0f9" containerID="a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4" exitCode=0 Dec 04 02:30:01 crc kubenswrapper[4764]: I1204 02:30:01.044012 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerDied","Data":"a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4"} 
Dec 04 02:30:02 crc kubenswrapper[4764]: I1204 02:30:02.066252 4764 generic.go:334] "Generic (PLEG): container finished" podID="ae5218dc-4f2b-4630-a7f9-bd640183b1a4" containerID="044a842bfeeed713251ab87e3914012be6145dd1590289b268fbed6d403b353d" exitCode=0 Dec 04 02:30:02 crc kubenswrapper[4764]: I1204 02:30:02.066331 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" event={"ID":"ae5218dc-4f2b-4630-a7f9-bd640183b1a4","Type":"ContainerDied","Data":"044a842bfeeed713251ab87e3914012be6145dd1590289b268fbed6d403b353d"} Dec 04 02:30:02 crc kubenswrapper[4764]: I1204 02:30:02.080163 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerStarted","Data":"ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c"} Dec 04 02:30:02 crc kubenswrapper[4764]: I1204 02:30:02.119764 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8nq9n" podStartSLOduration=3.587679525 podStartE2EDuration="6.119740753s" podCreationTimestamp="2025-12-04 02:29:56 +0000 UTC" firstStartedPulling="2025-12-04 02:29:58.996637274 +0000 UTC m=+10134.757961685" lastFinishedPulling="2025-12-04 02:30:01.528698502 +0000 UTC m=+10137.290022913" observedRunningTime="2025-12-04 02:30:02.106089588 +0000 UTC m=+10137.867414019" watchObservedRunningTime="2025-12-04 02:30:02.119740753 +0000 UTC m=+10137.881065164" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.517365 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.581781 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-config-volume\") pod \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.581913 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzvkn\" (UniqueName: \"kubernetes.io/projected/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-kube-api-access-zzvkn\") pod \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.582050 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-secret-volume\") pod \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\" (UID: \"ae5218dc-4f2b-4630-a7f9-bd640183b1a4\") " Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.583952 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "ae5218dc-4f2b-4630-a7f9-bd640183b1a4" (UID: "ae5218dc-4f2b-4630-a7f9-bd640183b1a4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.593084 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-kube-api-access-zzvkn" (OuterVolumeSpecName: "kube-api-access-zzvkn") pod "ae5218dc-4f2b-4630-a7f9-bd640183b1a4" (UID: "ae5218dc-4f2b-4630-a7f9-bd640183b1a4"). 
InnerVolumeSpecName "kube-api-access-zzvkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.603467 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ae5218dc-4f2b-4630-a7f9-bd640183b1a4" (UID: "ae5218dc-4f2b-4630-a7f9-bd640183b1a4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.683701 4764 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.683743 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzvkn\" (UniqueName: \"kubernetes.io/projected/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-kube-api-access-zzvkn\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:03 crc kubenswrapper[4764]: I1204 02:30:03.683755 4764 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ae5218dc-4f2b-4630-a7f9-bd640183b1a4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:04 crc kubenswrapper[4764]: I1204 02:30:04.104791 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" event={"ID":"ae5218dc-4f2b-4630-a7f9-bd640183b1a4","Type":"ContainerDied","Data":"a1adb9e95e0e94c6e13f972deb1a37440c1b20c020b1f33fc77a97ae5ce0b257"} Dec 04 02:30:04 crc kubenswrapper[4764]: I1204 02:30:04.104838 4764 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1adb9e95e0e94c6e13f972deb1a37440c1b20c020b1f33fc77a97ae5ce0b257" Dec 04 02:30:04 crc kubenswrapper[4764]: I1204 02:30:04.104897 4764 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413590-h5d57" Dec 04 02:30:04 crc kubenswrapper[4764]: I1204 02:30:04.634809 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4"] Dec 04 02:30:04 crc kubenswrapper[4764]: I1204 02:30:04.650177 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413545-qcdk4"] Dec 04 02:30:06 crc kubenswrapper[4764]: I1204 02:30:06.567946 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df1ac17-5d6f-4160-aec8-2fed933e366c" path="/var/lib/kubelet/pods/7df1ac17-5d6f-4160-aec8-2fed933e366c/volumes" Dec 04 02:30:07 crc kubenswrapper[4764]: I1204 02:30:07.279342 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:30:07 crc kubenswrapper[4764]: I1204 02:30:07.279692 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:30:07 crc kubenswrapper[4764]: I1204 02:30:07.342048 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:30:08 crc kubenswrapper[4764]: I1204 02:30:08.239526 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:30:08 crc kubenswrapper[4764]: I1204 02:30:08.316230 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nq9n"] Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.192567 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8nq9n" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="registry-server" 
containerID="cri-o://ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c" gracePeriod=2 Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.819119 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.841393 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9gsv\" (UniqueName: \"kubernetes.io/projected/96e45107-642d-4607-abec-a7684d84f0f9-kube-api-access-r9gsv\") pod \"96e45107-642d-4607-abec-a7684d84f0f9\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.841541 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-catalog-content\") pod \"96e45107-642d-4607-abec-a7684d84f0f9\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.841659 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-utilities\") pod \"96e45107-642d-4607-abec-a7684d84f0f9\" (UID: \"96e45107-642d-4607-abec-a7684d84f0f9\") " Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.847147 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-utilities" (OuterVolumeSpecName: "utilities") pod "96e45107-642d-4607-abec-a7684d84f0f9" (UID: "96e45107-642d-4607-abec-a7684d84f0f9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.853130 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e45107-642d-4607-abec-a7684d84f0f9-kube-api-access-r9gsv" (OuterVolumeSpecName: "kube-api-access-r9gsv") pod "96e45107-642d-4607-abec-a7684d84f0f9" (UID: "96e45107-642d-4607-abec-a7684d84f0f9"). InnerVolumeSpecName "kube-api-access-r9gsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.898023 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96e45107-642d-4607-abec-a7684d84f0f9" (UID: "96e45107-642d-4607-abec-a7684d84f0f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.944933 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9gsv\" (UniqueName: \"kubernetes.io/projected/96e45107-642d-4607-abec-a7684d84f0f9-kube-api-access-r9gsv\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.944966 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:10 crc kubenswrapper[4764]: I1204 02:30:10.944980 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96e45107-642d-4607-abec-a7684d84f0f9-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.201502 4764 generic.go:334] "Generic (PLEG): container finished" podID="96e45107-642d-4607-abec-a7684d84f0f9" 
containerID="ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c" exitCode=0 Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.201541 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerDied","Data":"ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c"} Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.201567 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8nq9n" event={"ID":"96e45107-642d-4607-abec-a7684d84f0f9","Type":"ContainerDied","Data":"7d06da9d475e2e99e29c6d3db8dbb49ffdeafa179e0cd33db95d82e9327ed41d"} Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.201582 4764 scope.go:117] "RemoveContainer" containerID="ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.201695 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8nq9n" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.245705 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8nq9n"] Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.248011 4764 scope.go:117] "RemoveContainer" containerID="a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.260575 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8nq9n"] Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.278516 4764 scope.go:117] "RemoveContainer" containerID="24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.324872 4764 scope.go:117] "RemoveContainer" containerID="ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c" Dec 04 02:30:11 crc kubenswrapper[4764]: E1204 02:30:11.325334 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c\": container with ID starting with ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c not found: ID does not exist" containerID="ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.325364 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c"} err="failed to get container status \"ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c\": rpc error: code = NotFound desc = could not find container \"ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c\": container with ID starting with ccb964c57e58c7814a7e95b25248bbac92f97dd8541ae8cba6b3d53952d1c33c not 
found: ID does not exist" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.325384 4764 scope.go:117] "RemoveContainer" containerID="a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4" Dec 04 02:30:11 crc kubenswrapper[4764]: E1204 02:30:11.326239 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4\": container with ID starting with a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4 not found: ID does not exist" containerID="a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.326261 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4"} err="failed to get container status \"a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4\": rpc error: code = NotFound desc = could not find container \"a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4\": container with ID starting with a76a4d87b00eca7900e36644e0ca3594889448ec8a4b653d08bfe36344b7e0c4 not found: ID does not exist" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.326273 4764 scope.go:117] "RemoveContainer" containerID="24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf" Dec 04 02:30:11 crc kubenswrapper[4764]: E1204 02:30:11.326709 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf\": container with ID starting with 24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf not found: ID does not exist" containerID="24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf" Dec 04 02:30:11 crc kubenswrapper[4764]: I1204 02:30:11.326745 4764 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf"} err="failed to get container status \"24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf\": rpc error: code = NotFound desc = could not find container \"24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf\": container with ID starting with 24284435bb98c30e819448360feb4d0fde2e5a23c31c7b0b1e01639cc88160cf not found: ID does not exist" Dec 04 02:30:12 crc kubenswrapper[4764]: I1204 02:30:12.556611 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e45107-642d-4607-abec-a7684d84f0f9" path="/var/lib/kubelet/pods/96e45107-642d-4607-abec-a7684d84f0f9/volumes" Dec 04 02:30:15 crc kubenswrapper[4764]: E1204 02:30:15.363757 4764 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.13:58920->38.102.83.13:39483: write tcp 38.102.83.13:58920->38.102.83.13:39483: write: broken pipe Dec 04 02:30:24 crc kubenswrapper[4764]: I1204 02:30:24.380526 4764 scope.go:117] "RemoveContainer" containerID="6b045b940c317695064f73cb729fee28dc62c01d9144db2a985ab66c7050c936" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.077964 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d6d82"] Dec 04 02:30:33 crc kubenswrapper[4764]: E1204 02:30:33.079550 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5218dc-4f2b-4630-a7f9-bd640183b1a4" containerName="collect-profiles" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.079577 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5218dc-4f2b-4630-a7f9-bd640183b1a4" containerName="collect-profiles" Dec 04 02:30:33 crc kubenswrapper[4764]: E1204 02:30:33.079634 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="extract-utilities" Dec 04 02:30:33 
crc kubenswrapper[4764]: I1204 02:30:33.079647 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="extract-utilities" Dec 04 02:30:33 crc kubenswrapper[4764]: E1204 02:30:33.079672 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="extract-content" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.079685 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="extract-content" Dec 04 02:30:33 crc kubenswrapper[4764]: E1204 02:30:33.079753 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="registry-server" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.079766 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="registry-server" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.080199 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e45107-642d-4607-abec-a7684d84f0f9" containerName="registry-server" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.080224 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5218dc-4f2b-4630-a7f9-bd640183b1a4" containerName="collect-profiles" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.083186 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.093254 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-catalog-content\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.093409 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzd8\" (UniqueName: \"kubernetes.io/projected/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-kube-api-access-2fzd8\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.093447 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-utilities\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.113651 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6d82"] Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.196247 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-catalog-content\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.196327 4764 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2fzd8\" (UniqueName: \"kubernetes.io/projected/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-kube-api-access-2fzd8\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.196351 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-utilities\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.196920 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-utilities\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.197161 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-catalog-content\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.216567 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzd8\" (UniqueName: \"kubernetes.io/projected/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-kube-api-access-2fzd8\") pod \"redhat-operators-d6d82\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.426108 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:33 crc kubenswrapper[4764]: I1204 02:30:33.978800 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d6d82"] Dec 04 02:30:34 crc kubenswrapper[4764]: I1204 02:30:34.486705 4764 generic.go:334] "Generic (PLEG): container finished" podID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerID="97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1" exitCode=0 Dec 04 02:30:34 crc kubenswrapper[4764]: I1204 02:30:34.486758 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerDied","Data":"97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1"} Dec 04 02:30:34 crc kubenswrapper[4764]: I1204 02:30:34.486961 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerStarted","Data":"92f928bf4671ea1836bdce2bf25cba80e09d97f8f7a9136abe82d618202e04b8"} Dec 04 02:30:35 crc kubenswrapper[4764]: I1204 02:30:35.503799 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerStarted","Data":"9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82"} Dec 04 02:30:39 crc kubenswrapper[4764]: I1204 02:30:39.557084 4764 generic.go:334] "Generic (PLEG): container finished" podID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerID="9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82" exitCode=0 Dec 04 02:30:39 crc kubenswrapper[4764]: I1204 02:30:39.557167 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" 
event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerDied","Data":"9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82"} Dec 04 02:30:40 crc kubenswrapper[4764]: I1204 02:30:40.577438 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerStarted","Data":"eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d"} Dec 04 02:30:40 crc kubenswrapper[4764]: I1204 02:30:40.610254 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d6d82" podStartSLOduration=2.116492246 podStartE2EDuration="7.610236798s" podCreationTimestamp="2025-12-04 02:30:33 +0000 UTC" firstStartedPulling="2025-12-04 02:30:34.48817445 +0000 UTC m=+10170.249498851" lastFinishedPulling="2025-12-04 02:30:39.981918952 +0000 UTC m=+10175.743243403" observedRunningTime="2025-12-04 02:30:40.607789398 +0000 UTC m=+10176.369113829" watchObservedRunningTime="2025-12-04 02:30:40.610236798 +0000 UTC m=+10176.371561219" Dec 04 02:30:43 crc kubenswrapper[4764]: I1204 02:30:43.426703 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:43 crc kubenswrapper[4764]: I1204 02:30:43.428519 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:44 crc kubenswrapper[4764]: I1204 02:30:44.492478 4764 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d6d82" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="registry-server" probeResult="failure" output=< Dec 04 02:30:44 crc kubenswrapper[4764]: timeout: failed to connect service ":50051" within 1s Dec 04 02:30:44 crc kubenswrapper[4764]: > Dec 04 02:30:53 crc kubenswrapper[4764]: I1204 02:30:53.484323 4764 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:53 crc kubenswrapper[4764]: I1204 02:30:53.534010 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:53 crc kubenswrapper[4764]: I1204 02:30:53.722557 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6d82"] Dec 04 02:30:54 crc kubenswrapper[4764]: I1204 02:30:54.935235 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d6d82" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="registry-server" containerID="cri-o://eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d" gracePeriod=2 Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.528650 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.693856 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fzd8\" (UniqueName: \"kubernetes.io/projected/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-kube-api-access-2fzd8\") pod \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.694656 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-utilities\") pod \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.694860 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-catalog-content\") pod 
\"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\" (UID: \"c1f883fd-c143-43b7-9dbe-cdcda836d6f5\") " Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.696175 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-utilities" (OuterVolumeSpecName: "utilities") pod "c1f883fd-c143-43b7-9dbe-cdcda836d6f5" (UID: "c1f883fd-c143-43b7-9dbe-cdcda836d6f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.696877 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.715452 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-kube-api-access-2fzd8" (OuterVolumeSpecName: "kube-api-access-2fzd8") pod "c1f883fd-c143-43b7-9dbe-cdcda836d6f5" (UID: "c1f883fd-c143-43b7-9dbe-cdcda836d6f5"). InnerVolumeSpecName "kube-api-access-2fzd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.798539 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fzd8\" (UniqueName: \"kubernetes.io/projected/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-kube-api-access-2fzd8\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.836667 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1f883fd-c143-43b7-9dbe-cdcda836d6f5" (UID: "c1f883fd-c143-43b7-9dbe-cdcda836d6f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.900494 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1f883fd-c143-43b7-9dbe-cdcda836d6f5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.947131 4764 generic.go:334] "Generic (PLEG): container finished" podID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerID="eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d" exitCode=0 Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.947171 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerDied","Data":"eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d"} Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.947196 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d6d82" event={"ID":"c1f883fd-c143-43b7-9dbe-cdcda836d6f5","Type":"ContainerDied","Data":"92f928bf4671ea1836bdce2bf25cba80e09d97f8f7a9136abe82d618202e04b8"} Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.947211 4764 scope.go:117] "RemoveContainer" containerID="eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.947308 4764 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d6d82" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.983752 4764 scope.go:117] "RemoveContainer" containerID="9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82" Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.984194 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d6d82"] Dec 04 02:30:55 crc kubenswrapper[4764]: I1204 02:30:55.994658 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d6d82"] Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.031701 4764 scope.go:117] "RemoveContainer" containerID="97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.071314 4764 scope.go:117] "RemoveContainer" containerID="eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d" Dec 04 02:30:56 crc kubenswrapper[4764]: E1204 02:30:56.071870 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d\": container with ID starting with eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d not found: ID does not exist" containerID="eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.071907 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d"} err="failed to get container status \"eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d\": rpc error: code = NotFound desc = could not find container \"eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d\": container with ID starting with eea9889863509234476b7206aaff6af6fe2d5b339cafa94035b17ff0ee1c3c1d not found: ID does 
not exist" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.071935 4764 scope.go:117] "RemoveContainer" containerID="9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82" Dec 04 02:30:56 crc kubenswrapper[4764]: E1204 02:30:56.072291 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82\": container with ID starting with 9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82 not found: ID does not exist" containerID="9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.072324 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82"} err="failed to get container status \"9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82\": rpc error: code = NotFound desc = could not find container \"9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82\": container with ID starting with 9ac3e6431d08bdda381052613c06c87fcf6bcb02e244d89f1edb0cc378a47f82 not found: ID does not exist" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.072343 4764 scope.go:117] "RemoveContainer" containerID="97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1" Dec 04 02:30:56 crc kubenswrapper[4764]: E1204 02:30:56.072656 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1\": container with ID starting with 97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1 not found: ID does not exist" containerID="97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.072678 4764 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1"} err="failed to get container status \"97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1\": rpc error: code = NotFound desc = could not find container \"97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1\": container with ID starting with 97f1ca3c189324c3c170184dd6a5006bbaeee591c054c774ca51dfe3a1b614e1 not found: ID does not exist" Dec 04 02:30:56 crc kubenswrapper[4764]: I1204 02:30:56.579354 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" path="/var/lib/kubelet/pods/c1f883fd-c143-43b7-9dbe-cdcda836d6f5/volumes" Dec 04 02:31:56 crc kubenswrapper[4764]: I1204 02:31:56.766567 4764 generic.go:334] "Generic (PLEG): container finished" podID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerID="f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51" exitCode=0 Dec 04 02:31:56 crc kubenswrapper[4764]: I1204 02:31:56.766658 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-q6sff/must-gather-9dkzh" event={"ID":"9fb45c90-358b-40d5-9554-c3f6c445ec83","Type":"ContainerDied","Data":"f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51"} Dec 04 02:31:56 crc kubenswrapper[4764]: I1204 02:31:56.769622 4764 scope.go:117] "RemoveContainer" containerID="f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51" Dec 04 02:31:57 crc kubenswrapper[4764]: I1204 02:31:57.432266 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q6sff_must-gather-9dkzh_9fb45c90-358b-40d5-9554-c3f6c445ec83/gather/0.log" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.324046 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-q6sff/must-gather-9dkzh"] Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.324918 4764 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-must-gather-q6sff/must-gather-9dkzh" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="copy" containerID="cri-o://0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416" gracePeriod=2 Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.335644 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-q6sff/must-gather-9dkzh"] Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.805955 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q6sff_must-gather-9dkzh_9fb45c90-358b-40d5-9554-c3f6c445ec83/copy/0.log" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.810051 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.819427 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9fb45c90-358b-40d5-9554-c3f6c445ec83-must-gather-output\") pod \"9fb45c90-358b-40d5-9554-c3f6c445ec83\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.819498 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf6t8\" (UniqueName: \"kubernetes.io/projected/9fb45c90-358b-40d5-9554-c3f6c445ec83-kube-api-access-kf6t8\") pod \"9fb45c90-358b-40d5-9554-c3f6c445ec83\" (UID: \"9fb45c90-358b-40d5-9554-c3f6c445ec83\") " Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.830597 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb45c90-358b-40d5-9554-c3f6c445ec83-kube-api-access-kf6t8" (OuterVolumeSpecName: "kube-api-access-kf6t8") pod "9fb45c90-358b-40d5-9554-c3f6c445ec83" (UID: "9fb45c90-358b-40d5-9554-c3f6c445ec83"). InnerVolumeSpecName "kube-api-access-kf6t8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.877850 4764 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-q6sff_must-gather-9dkzh_9fb45c90-358b-40d5-9554-c3f6c445ec83/copy/0.log" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.878198 4764 generic.go:334] "Generic (PLEG): container finished" podID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerID="0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416" exitCode=143 Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.878258 4764 scope.go:117] "RemoveContainer" containerID="0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.878409 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-q6sff/must-gather-9dkzh" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.915157 4764 scope.go:117] "RemoveContainer" containerID="f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.921548 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf6t8\" (UniqueName: \"kubernetes.io/projected/9fb45c90-358b-40d5-9554-c3f6c445ec83-kube-api-access-kf6t8\") on node \"crc\" DevicePath \"\"" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.953058 4764 scope.go:117] "RemoveContainer" containerID="0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416" Dec 04 02:32:05 crc kubenswrapper[4764]: E1204 02:32:05.953420 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416\": container with ID starting with 0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416 not found: ID does not exist" 
containerID="0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.953459 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416"} err="failed to get container status \"0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416\": rpc error: code = NotFound desc = could not find container \"0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416\": container with ID starting with 0199811fc4a70ed4be52eece915f6713018492a06298950bd066abbfba9a3416 not found: ID does not exist" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.953486 4764 scope.go:117] "RemoveContainer" containerID="f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51" Dec 04 02:32:05 crc kubenswrapper[4764]: E1204 02:32:05.953690 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51\": container with ID starting with f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51 not found: ID does not exist" containerID="f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51" Dec 04 02:32:05 crc kubenswrapper[4764]: I1204 02:32:05.953757 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51"} err="failed to get container status \"f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51\": rpc error: code = NotFound desc = could not find container \"f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51\": container with ID starting with f5e5576070611f7b5a11ea84f2753740b00b75b279bbffa4ef58fc2616103d51 not found: ID does not exist" Dec 04 02:32:06 crc kubenswrapper[4764]: I1204 02:32:06.010704 4764 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb45c90-358b-40d5-9554-c3f6c445ec83-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9fb45c90-358b-40d5-9554-c3f6c445ec83" (UID: "9fb45c90-358b-40d5-9554-c3f6c445ec83"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:32:06 crc kubenswrapper[4764]: I1204 02:32:06.022870 4764 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9fb45c90-358b-40d5-9554-c3f6c445ec83-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 02:32:06 crc kubenswrapper[4764]: I1204 02:32:06.605560 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" path="/var/lib/kubelet/pods/9fb45c90-358b-40d5-9554-c3f6c445ec83/volumes" Dec 04 02:32:20 crc kubenswrapper[4764]: I1204 02:32:20.871277 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:32:20 crc kubenswrapper[4764]: I1204 02:32:20.872096 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:32:50 crc kubenswrapper[4764]: I1204 02:32:50.868534 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 04 02:32:50 crc kubenswrapper[4764]: I1204 02:32:50.869311 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.254115 4764 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nbhxp"] Dec 04 02:32:51 crc kubenswrapper[4764]: E1204 02:32:51.254908 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="extract-utilities" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.254928 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="extract-utilities" Dec 04 02:32:51 crc kubenswrapper[4764]: E1204 02:32:51.254956 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="gather" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.254964 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="gather" Dec 04 02:32:51 crc kubenswrapper[4764]: E1204 02:32:51.254983 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="registry-server" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.254991 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="registry-server" Dec 04 02:32:51 crc kubenswrapper[4764]: E1204 02:32:51.255023 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="copy" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 
02:32:51.255032 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="copy" Dec 04 02:32:51 crc kubenswrapper[4764]: E1204 02:32:51.255058 4764 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="extract-content" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.255066 4764 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="extract-content" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.255901 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="gather" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.255971 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f883fd-c143-43b7-9dbe-cdcda836d6f5" containerName="registry-server" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.256040 4764 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb45c90-358b-40d5-9554-c3f6c445ec83" containerName="copy" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.259338 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.276146 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbhxp"] Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.306407 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-utilities\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.306503 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-catalog-content\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.306846 4764 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qtw\" (UniqueName: \"kubernetes.io/projected/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-kube-api-access-65qtw\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.409002 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-utilities\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.409054 4764 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-catalog-content\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.409125 4764 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qtw\" (UniqueName: \"kubernetes.io/projected/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-kube-api-access-65qtw\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.409663 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-utilities\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.409678 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-catalog-content\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.441869 4764 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qtw\" (UniqueName: \"kubernetes.io/projected/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-kube-api-access-65qtw\") pod \"community-operators-nbhxp\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:51 crc kubenswrapper[4764]: I1204 02:32:51.589022 4764 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:32:52 crc kubenswrapper[4764]: I1204 02:32:52.258934 4764 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nbhxp"] Dec 04 02:32:52 crc kubenswrapper[4764]: I1204 02:32:52.419933 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerStarted","Data":"d58a95d46d523b3871167be6df6d9dbdca00535e6a4a6afd8a1e0f2f2110077f"} Dec 04 02:32:53 crc kubenswrapper[4764]: I1204 02:32:53.430883 4764 generic.go:334] "Generic (PLEG): container finished" podID="6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" containerID="3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4" exitCode=0 Dec 04 02:32:53 crc kubenswrapper[4764]: I1204 02:32:53.431215 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerDied","Data":"3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4"} Dec 04 02:32:54 crc kubenswrapper[4764]: I1204 02:32:54.447380 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerStarted","Data":"23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d"} Dec 04 02:32:55 crc kubenswrapper[4764]: I1204 02:32:55.466957 4764 generic.go:334] "Generic (PLEG): container finished" podID="6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" containerID="23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d" exitCode=0 Dec 04 02:32:55 crc kubenswrapper[4764]: I1204 02:32:55.467045 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" 
event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerDied","Data":"23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d"} Dec 04 02:32:56 crc kubenswrapper[4764]: I1204 02:32:56.480067 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerStarted","Data":"de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216"} Dec 04 02:32:56 crc kubenswrapper[4764]: I1204 02:32:56.502693 4764 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nbhxp" podStartSLOduration=3.020362665 podStartE2EDuration="5.502676952s" podCreationTimestamp="2025-12-04 02:32:51 +0000 UTC" firstStartedPulling="2025-12-04 02:32:53.433090168 +0000 UTC m=+10309.194414579" lastFinishedPulling="2025-12-04 02:32:55.915404455 +0000 UTC m=+10311.676728866" observedRunningTime="2025-12-04 02:32:56.501160785 +0000 UTC m=+10312.262485186" watchObservedRunningTime="2025-12-04 02:32:56.502676952 +0000 UTC m=+10312.264001363" Dec 04 02:33:01 crc kubenswrapper[4764]: I1204 02:33:01.589300 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:33:01 crc kubenswrapper[4764]: I1204 02:33:01.589838 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:33:01 crc kubenswrapper[4764]: I1204 02:33:01.641039 4764 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:33:02 crc kubenswrapper[4764]: I1204 02:33:02.614401 4764 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:33:02 crc kubenswrapper[4764]: I1204 02:33:02.670096 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-nbhxp"] Dec 04 02:33:04 crc kubenswrapper[4764]: I1204 02:33:04.570977 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nbhxp" podUID="6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" containerName="registry-server" containerID="cri-o://de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216" gracePeriod=2 Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.084036 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.242549 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-utilities\") pod \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.243056 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65qtw\" (UniqueName: \"kubernetes.io/projected/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-kube-api-access-65qtw\") pod \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.243098 4764 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-catalog-content\") pod \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\" (UID: \"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02\") " Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.243821 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-utilities" (OuterVolumeSpecName: "utilities") pod "6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" (UID: 
"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.270149 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-kube-api-access-65qtw" (OuterVolumeSpecName: "kube-api-access-65qtw") pod "6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" (UID: "6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02"). InnerVolumeSpecName "kube-api-access-65qtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.292023 4764 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" (UID: "6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.345462 4764 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65qtw\" (UniqueName: \"kubernetes.io/projected/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-kube-api-access-65qtw\") on node \"crc\" DevicePath \"\"" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.345497 4764 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.345507 4764 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.588167 4764 generic.go:334] "Generic (PLEG): container finished" 
podID="6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" containerID="de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216" exitCode=0 Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.588225 4764 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nbhxp" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.588240 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerDied","Data":"de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216"} Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.588296 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nbhxp" event={"ID":"6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02","Type":"ContainerDied","Data":"d58a95d46d523b3871167be6df6d9dbdca00535e6a4a6afd8a1e0f2f2110077f"} Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.588334 4764 scope.go:117] "RemoveContainer" containerID="de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.624046 4764 scope.go:117] "RemoveContainer" containerID="23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d" Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.647688 4764 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nbhxp"] Dec 04 02:33:05 crc kubenswrapper[4764]: I1204 02:33:05.658998 4764 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nbhxp"] Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.082063 4764 scope.go:117] "RemoveContainer" containerID="3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.365628 4764 scope.go:117] "RemoveContainer" 
containerID="de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216" Dec 04 02:33:06 crc kubenswrapper[4764]: E1204 02:33:06.366485 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216\": container with ID starting with de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216 not found: ID does not exist" containerID="de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.366551 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216"} err="failed to get container status \"de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216\": rpc error: code = NotFound desc = could not find container \"de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216\": container with ID starting with de5c924f6412afe05c8994692eebdd719eff752b98dff7b097fc7f4cae072216 not found: ID does not exist" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.366591 4764 scope.go:117] "RemoveContainer" containerID="23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d" Dec 04 02:33:06 crc kubenswrapper[4764]: E1204 02:33:06.367056 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d\": container with ID starting with 23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d not found: ID does not exist" containerID="23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.367095 4764 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d"} err="failed to get container status \"23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d\": rpc error: code = NotFound desc = could not find container \"23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d\": container with ID starting with 23c02ee3692a9542508995d6dff25a1113f183fab09a70160c3b667e4d65e78d not found: ID does not exist" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.367121 4764 scope.go:117] "RemoveContainer" containerID="3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4" Dec 04 02:33:06 crc kubenswrapper[4764]: E1204 02:33:06.367530 4764 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4\": container with ID starting with 3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4 not found: ID does not exist" containerID="3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.367597 4764 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4"} err="failed to get container status \"3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4\": rpc error: code = NotFound desc = could not find container \"3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4\": container with ID starting with 3af96ea457b2b69041ae356b1373a6b2d76caaef2bfc8047539eb54f5a88e5a4 not found: ID does not exist" Dec 04 02:33:06 crc kubenswrapper[4764]: I1204 02:33:06.569131 4764 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02" path="/var/lib/kubelet/pods/6dcbfd5d-f3a9-47d7-aff7-64e5949e9f02/volumes" Dec 04 02:33:20 crc kubenswrapper[4764]: I1204 
02:33:20.869269 4764 patch_prober.go:28] interesting pod/machine-config-daemon-hpltl container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 02:33:20 crc kubenswrapper[4764]: I1204 02:33:20.869895 4764 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 02:33:20 crc kubenswrapper[4764]: I1204 02:33:20.869961 4764 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" Dec 04 02:33:20 crc kubenswrapper[4764]: I1204 02:33:20.871227 4764 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19"} pod="openshift-machine-config-operator/machine-config-daemon-hpltl" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 02:33:20 crc kubenswrapper[4764]: I1204 02:33:20.871383 4764 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerName="machine-config-daemon" containerID="cri-o://9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" gracePeriod=600 Dec 04 02:33:21 crc kubenswrapper[4764]: E1204 02:33:21.003576 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:33:21 crc kubenswrapper[4764]: I1204 02:33:21.776398 4764 generic.go:334] "Generic (PLEG): container finished" podID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" exitCode=0 Dec 04 02:33:21 crc kubenswrapper[4764]: I1204 02:33:21.776783 4764 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" event={"ID":"dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4","Type":"ContainerDied","Data":"9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19"} Dec 04 02:33:21 crc kubenswrapper[4764]: I1204 02:33:21.776932 4764 scope.go:117] "RemoveContainer" containerID="30e2e2ee18a6efe63ca5abfbf45ed7169376c0d67d98dc3282f66e3055425a5b" Dec 04 02:33:21 crc kubenswrapper[4764]: I1204 02:33:21.777598 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:33:21 crc kubenswrapper[4764]: E1204 02:33:21.778216 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:33:36 crc kubenswrapper[4764]: I1204 02:33:36.547147 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:33:36 crc kubenswrapper[4764]: E1204 02:33:36.548336 4764 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:33:47 crc kubenswrapper[4764]: I1204 02:33:47.546202 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:33:47 crc kubenswrapper[4764]: E1204 02:33:47.547187 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:33:59 crc kubenswrapper[4764]: I1204 02:33:59.547535 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:33:59 crc kubenswrapper[4764]: E1204 02:33:59.548686 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:34:10 crc kubenswrapper[4764]: I1204 02:34:10.546424 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:34:10 crc kubenswrapper[4764]: E1204 02:34:10.547134 4764 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:34:24 crc kubenswrapper[4764]: I1204 02:34:24.545667 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:34:24 crc kubenswrapper[4764]: E1204 02:34:24.546376 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:34:36 crc kubenswrapper[4764]: I1204 02:34:36.545602 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:34:36 crc kubenswrapper[4764]: E1204 02:34:36.546428 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:34:47 crc kubenswrapper[4764]: I1204 02:34:47.545469 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:34:47 crc kubenswrapper[4764]: E1204 02:34:47.546565 4764 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:35:01 crc kubenswrapper[4764]: I1204 02:35:01.545956 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:35:01 crc kubenswrapper[4764]: E1204 02:35:01.547225 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4" Dec 04 02:35:13 crc kubenswrapper[4764]: I1204 02:35:13.547036 4764 scope.go:117] "RemoveContainer" containerID="9ff66c051d004548c2fbfce470b2c7ed7bf0610f7fd9db5889e61f397bc66b19" Dec 04 02:35:13 crc kubenswrapper[4764]: E1204 02:35:13.548505 4764 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hpltl_openshift-machine-config-operator(dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-hpltl" podUID="dd3dd2ae-2b58-4de0-8ebe-773d83ac87f4"